Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/03/17 16:32:06 UTC

Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #646

See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/646/display/redirect?page=changes>

Changes:

[noreply] Mapped JOB_STATE_RESOURCE_CLEANING_UP to State.RUNNING.

[ryanthompson591] fixed typo in typehints

[zyichi] Remove unused prebuild_sdk_container_base_iamge option from validate

[hengfeng] feat: add more custom metrics

[noreply] [BEAM-14103][Playgrounf][Bugfix] Fix google analytics id (#17092)

[noreply] Minor: Make ScopedReadStateSupplier final (#16992)

[noreply] [BEAM-14113] Improve SamzaJobServerDriver extensibility (#17099)

[noreply] [BEAM-14116] Chunk commit requests dynamically (#17004)

[noreply] Merge pull request #17079 from [BEAM-13660] Add types and queries in

[noreply] [BEAM-13888] Add unit testing to ioutilx (#17058)

[noreply] Merge pull request #16822 from [BEAM-13841][Playground] Add Application

[noreply] Minor: Make serializableCoder warning gramatically correct english

[noreply] [BEAM-14091] Fixing Interactive Beam show/collect for remote runners

[noreply] [BEAM-11934] Add enable_file_dynamic_sharding to allow DataflowRunner

[noreply] [BEAM-12777] Create symlink for `current` directory (#17105)

[noreply] [BEAM-14020] Adding SchemaTransform, SchemaTransformProvider,

[noreply] [BEAM-13015] Modify metrics to begin and reset to a non-dirty state.


------------------------------------------
[...truncated 57.07 KB...]
> Task :sdks:java:harness:jar

> Task :runners:java-fn-execution:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:java-fn-execution:classes
> Task :runners:java-fn-execution:jar

> Task :sdks:java:expansion-service:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:expansion-service:classes
> Task :sdks:java:expansion-service:jar

> Task :sdks:java:io:kafka:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:kafka:classes
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:google-cloud-platform:classes
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:google-cloud-dataflow-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:****:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:****:classes
> Task :runners:google-cloud-dataflow-java:****:shadowJar

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
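
The --pubsub_namespace_prefix flag is not a built-in pipeline option, so the options parser reports it as unparseable and drops it; the load test presumably consumes it through its own argument handling. If the parser were meant to accept such a flag, it could be registered as a custom option. A minimal sketch, assuming a hypothetical options class (not the test's actual code):

    # Hypothetical custom options class; the name and default are assumptions.
    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfTestOptions(PipelineOptions):
      @classmethod
      def _add_argparse_args(cls, parser):
        parser.add_argument(
            '--pubsub_namespace_prefix',
            default=None,
            help='Prefix used to namespace the test topics and subscriptions.')

    # Usage sketch:
    # options = PubsubPerfTestOptions(argv)
    # prefix = options.view_as(PubsubPerfTestOptions).pubsub_namespace_prefix
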
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220317160300078338-9170'
 createTime: '2022-03-17T16:03:06.851624Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-17_09_03_06-16520541463599878889'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0317150459'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-17T16:03:06.851624Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-17_09_03_06-16520541463599878889]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-17_09_03_06-16520541463599878889
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-17_09_03_06-16520541463599878889?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-17_09_03_06-16520541463599878889 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:31.508Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:33.489Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:33.700Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:33.960Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.164Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.197Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.232Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.281Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.334Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.366Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.399Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.431Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.482Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.514Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.567Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.600Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.702Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.764Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.871Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:35.018Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:35.310Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:35.620Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:35.762Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:35.796Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:55.849Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
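
As the message above suggests, freeing the 100-descriptor budget means deleting custom metric descriptors that are no longer used. A minimal sketch with the Cloud Monitoring client, assuming an ad hoc cleanup of the apache-beam-testing project (the filter and loop are illustrative, not the team's actual tooling):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project = "projects/apache-beam-testing"  # example project path

    # List only the Dataflow-created custom metric descriptors.
    descriptors = client.list_metric_descriptors(
        request={
            "name": project,
            "filter": 'metric.type = starts_with("custom.googleapis.com/")',
        })

    for descriptor in descriptors:
        # Only delete descriptors for metrics that are genuinely unused.
        client.delete_metric_descriptor(request={"name": descriptor.name})
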
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:04:05.528Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:04:05.586Z: JOB_MESSAGE_DETAILED: Resized **** pool to 3, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:04:16.013Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:04:39.379Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:04:39.409Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-17_09_03_06-16520541463599878889 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 799b919e335e4da5b604bf26cf2e95f7 and timestamp: 1647533794.896252:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 118
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/dataflow-****.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220317161639862367-7675'
 createTime: '2022-03-17T16:16:45.948426Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-17_09_16_45-14787157976749017395'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0317150459'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-17T16:16:45.948426Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-17_09_16_45-14787157976749017395]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-17_09_16_45-14787157976749017395
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-17_09_16_45-14787157976749017395?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-17_09_16_45-14787157976749017395 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:16:54.906Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:01.982Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.012Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.088Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.247Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.319Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.473Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.610Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.780Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.934Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.013Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.060Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.144Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.211Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.336Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.391Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.467Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.525Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.589Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.668Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.725Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.769Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.810Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.852Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.875Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.913Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.949Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.998Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:04.020Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:04.063Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:04.782Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:19:24.719Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:19:24.756Z: JOB_MESSAGE_DETAILED: Resized **** pool to 3, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:19:45.589Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:19:45.663Z: JOB_MESSAGE_DETAILED: Resized **** pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:19:56.248Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:19:59.060Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:19:59.094Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-17_09_16_45-14787157976749017395 after 601 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: f3c179ef93754940a4bcf4371370c909 and timestamp: 1647534722.8106575:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 106
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: f3c179ef93754940a4bcf4371370c909 and timestamp: 1647534722.8106575:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 106
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_49e91cf1-c4de-45b6-a055-d0a8ceb6c521_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-17_09_03_06-16520541463599878889?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-17_09_16_45-14787157976749017395?project=apache-beam-testing
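
The TypeError above comes from the google-cloud-pubsub 2.x calling convention: the cleanup step passes a bare subscription path as the positional argument, which the 2.x SubscriberClient tries to interpret as a DeleteSubscriptionRequest and rejects. A minimal sketch of the keyword-style call the 2.x client expects (the path below is a hypothetical example, and this is an illustration rather than the test's eventual fix):

    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()
    # Hypothetical subscription path, mirroring the one in the error message.
    sub_path = ("projects/apache-beam-testing/subscriptions/"
                "pubsub_io_performance_example_read")

    # google-cloud-pubsub >= 2.x: pass the path as a keyword argument ...
    subscriber.delete_subscription(subscription=sub_path)
    # ... or wrap it in a request dict/object:
    # subscriber.delete_subscription(request={"subscription": sub_path})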

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 39s
92 actionable tasks: 73 executed, 17 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ei2hbo4fxnbac

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PerformanceTests_PubsubIOIT_Python_Streaming #860

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/860/display/redirect?page=changes>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #859

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/859/display/redirect?page=changes>

Changes:

[noreply] [GitHub Actions] - Verify Release Build Workflow  (#23390)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e1596578e0fd0bfac241db3dfb138bceb07b6f5b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e1596578e0fd0bfac241db3dfb138bceb07b6f5b # timeout=10
Commit message: "[GitHub Actions] - Verify Release Build Workflow  (#23390)"
 > git rev-list --no-walk 8e2431c0e55237af4bd00a9786e4c150e20d4e14 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8209363837533901480.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb1016150404 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/fjrxxanjsnlfo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #858

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/858/display/redirect?page=changes>

Changes:

[Moritz Mack] [Spark dataset runner] Add direct translation of Reshuffle and

[noreply] Make GCP OAuth scopes configurable via pipeline options. (#23644)

[noreply] Update BQIO to a single scheduled executor service reduce threads


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 8e2431c0e55237af4bd00a9786e4c150e20d4e14 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 8e2431c0e55237af4bd00a9786e4c150e20d4e14 # timeout=10
Commit message: "Update BQIO to a single scheduled executor service reduce threads (#23234)"
 > git rev-list --no-walk b784c988643b5c4ae2e25bb2d1d5317576b858ec # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1381995478477040812.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb1015150436 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 2s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/6vwxm5rupq3hu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #857

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/857/display/redirect?page=changes>

Changes:

[Kiley Sok] Add agent to open modules

[Kiley Sok] check for empty

[Kiley Sok] limit to jamm and update comments

[Kiley Sok] reuse options, pr comments

[rszper] Added content: The direct runner is not suited to production pipelines

[yixiaoshen] Remove artificial timeout in FirestoreV1IT, Dataflow runner is very slow

[Moritz Mack] Minor improvements to the tpcds gradle build for Spark

[Moritz Mack] Fix SparkSessionFactory to not fail when using Spark master local[*]

[Moritz Mack] Align translation logging for Spark dataset runner with rdd runner for

[noreply] Update

[noreply] Merge pull request #23524: Adding beam blog info to the Community page

[noreply] Update publish_release_notes to generate PR list (#23630)

[noreply] Bump Legacy dataflow container image tag (#23625)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b784c988643b5c4ae2e25bb2d1d5317576b858ec (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b784c988643b5c4ae2e25bb2d1d5317576b858ec # timeout=10
Commit message: "Merge pull request #23621: Minor improvements to the tpcds gradle build for Spark"
 > git rev-list --no-walk cc82c3201296f5243ec6935334165b0321d93891 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4239421924779187200.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb1014150418 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/v3uedzxlgto6o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #856

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/856/display/redirect?page=changes>

Changes:

[noreply] Migrate GcsOptions#getExecutorService to an unbounded

[noreply] (BQ Java) Explicitly set coder for multi-partition batch load writes 

[noreply] Fix typo in bootstrap_beam_venv.py (#23574)

[noreply] Bump github.com/spf13/cobra from 1.5.0 to 1.6.0 in /sdks (#23591)

[noreply] [Playground][Tour Of Beam] Datastore entities split by origin (#23088)

[noreply] use write schema only for read api (#23594)

[noreply] [Go SDK]: SingleFlight bundle descriptor requests (#23589)

[noreply] Extend a timeout to create a bt cluster. (#23617)

[noreply] Use new github output format (#23624)

[noreply] Tour of Beam frontend state management (#23420) (#23572)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision cc82c3201296f5243ec6935334165b0321d93891 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f cc82c3201296f5243ec6935334165b0321d93891 # timeout=10
Commit message: "Tour of Beam frontend state management (#23420) (#23572)"
 > git rev-list --no-walk 1c1ecb2a36dfb256a0e5eb7ff9ac5f84601b51e2 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4628287526190925814.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb1013150431 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/g6nuqrxsfxzok

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #855

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/855/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Add a multi-process shared utility.

[Robert Bradshaw] Add fastener dependency.

[Robert Bradshaw] Refactor to have an explicit acquire/release API.

[Robert Bradshaw] Drop a TODO about deferred construction parameterization.

[Robert Bradshaw] Fix unused import/var.

[Moritz Mack] Replace website references to deprecated aws / kinesis modules with more

[Alexey Romanenko] [website][adhoc] Fix spellcheck errors and typos

[noreply] Add database role to SpannerConfig for role-based access control.

[noreply] Remove obsolete and deprecated bigquery native read. (#23557)

[noreply] Feature/name all java threads (#23387)

[noreply] [Go SDK] Don't construct plans in lock section. (#23583)

[noreply] Remove obsolete and deprecated bigquery native write. #23557 (#23558)

[noreply] Increase Python PostCommit timeout. (#23595)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1c1ecb2a36dfb256a0e5eb7ff9ac5f84601b51e2 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1c1ecb2a36dfb256a0e5eb7ff9ac5f84601b51e2 # timeout=10
Commit message: "Merge pull request #23575: [website][adhoc] Fix spellcheck errors and typos"
 > git rev-list --no-walk 1d573e2f8801c9bf96a4a14a7897cae675360821 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5349632332885607494.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb1012150737 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/qlsubcxnq4eyu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
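
Note on the failure above: the error points at line 51 of sdks/python/apache_beam/testing/load_tests/build.gradle, which evidently dereferences a 'shadowJar' property on the Dataflow worker project (its name is masked as **** in this archive). A minimal sketch of that failing pattern, plus a guarded lookup that avoids the evaluation-time error, assuming a hypothetical project path and task name (this is not the actual Beam build file):

// Sketch only: an illustrative cross-project reference of the shape that produces
// "Could not get unknown property 'shadowJar' ... of type org.gradle.api.Project".
// The project path below is an assumption; the real path is masked as **** above.
def workerProject = project(':runners:google-cloud-dataflow-java:worker')

task runLoadTest {
    // Fails at configuration time when the referenced project defines no
    // 'shadowJar' task or property (e.g. the Shadow plugin is not applied there):
    dependsOn workerProject.shadowJar

    // A tolerant lookup would only add the dependency when the task actually exists:
    // def shadow = workerProject.tasks.findByName('shadowJar')
    // if (shadow != null) { dependsOn shadow }
}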

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #854

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/854/display/redirect?page=changes>

Changes:

[git] BEAM-13592 Add getOrderingKey in

[git] Add CHANGES entry

[git] Rename transform name according to review comment

[git] Update to pass ordering key

[egalpin] Adds ordering key to OutgoingMessage builder, adds new coders to pubsub

[egalpin] Fixes pubsub bounded writer allowing for orderingKey

[egalpin] Alters order of pubsub message support in registrar

[egalpin] Removed publishTime and messageId in grpc pubsub client publish

[egalpin] Attempts to allow different pubsub root url for PubsubIO.Write

[egalpin] Fixes pubsub tests root url

[egalpin] Puts PubsubMessageCoder last in registrar

[egalpin] Uses MoreObjects over Objects

[egalpin] Renames PubsubMessageCoder to

[bulat.safiullin] [Website] update python-dependencies.md link #23478

[bulat.safiullin] [Website] update styles of iframe with video #23499

[bulat.safiullin] [Website] add version.html to shortcodes, update jet.md 22985

[Moritz Mack] Downgrade Scala version in Spark job-server to prevent Scala

[noreply] Support named databases in Firestore connector. Fix and enable Firestore

[noreply] [fixes #23000] Update the Python SDK harness state cache to be a loading

[noreply] Fix permission for Build python wheel branch_repo_nightly step (#23563)

[noreply] [Playground] complexity indicator (#23477)

[noreply] Rolling forward property-based tests for coders (#23425)

[noreply] Updated README for jupyterlab-sidepanel

[noreply] fix distribution example in golang guide (#23567)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1d573e2f8801c9bf96a4a14a7897cae675360821 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1d573e2f8801c9bf96a4a14a7897cae675360821 # timeout=10
Commit message: "Merge pull request #22216 from gemelen/beam-13592-pubsub-java-orderingkey"
 > git rev-list --no-walk b94cff209cc8d1ae61cc916ff6b0b68561dc34c8 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins3472055826150679903.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb1011150503 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/aj54cyab74ffy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
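
As an aside on the job naming: the "2GB" in these test names lines up with the synthetic --input_options passed in the Gradle command above. A quick standalone Groovy check of that arithmetic (illustrative only, not part of the build):

// 2,097,152 records with 1,024-byte values is exactly 2 GiB of value payload;
// the 1-byte keys contribute roughly another 2 MiB on top.
long numRecords = 2097152L
long valueSize = 1024L
long payloadBytes = numRecords * valueSize
assert payloadBytes == 2L * 1024 * 1024 * 1024
println "synthetic value payload: ${payloadBytes} bytes (2 GiB)"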

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #853

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/853/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b94cff209cc8d1ae61cc916ff6b0b68561dc34c8 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b94cff209cc8d1ae61cc916ff6b0b68561dc34c8 # timeout=10
Commit message: "Merge pull request #23547: update bom to the latest one."
 > git rev-list --no-walk b94cff209cc8d1ae61cc916ff6b0b68561dc34c8 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2781840581994273718.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb1010150426 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/pha226dvkvn6q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #852

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/852/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b94cff209cc8d1ae61cc916ff6b0b68561dc34c8 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b94cff209cc8d1ae61cc916ff6b0b68561dc34c8 # timeout=10
Commit message: "Merge pull request #23547: update bom to the latest one."
 > git rev-list --no-walk b94cff209cc8d1ae61cc916ff6b0b68561dc34c8 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins860181229328134681.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb1009150429 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/chm5n65nn4kuq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #851

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/851/display/redirect?page=changes>

Changes:

[Moritz Mack] Correctly detect retryable TransientKinesisExceptions (fixes #23517)

[noreply] Fix small error message typo

[noreply] Fixing right nav on Get Started page (#23543)

[noreply] Bump google.golang.org/grpc from 1.49.0 to 1.50.0 in /sdks (#23533)

[noreply] Merge pull request #23547: update bom to the latest one.


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b94cff209cc8d1ae61cc916ff6b0b68561dc34c8 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b94cff209cc8d1ae61cc916ff6b0b68561dc34c8 # timeout=10
Commit message: "Merge pull request #23547: update bom to the latest one."
 > git rev-list --no-walk fc6f400f9abbbe213b5573592cf7a938b5bf16d5 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7190073281225676120.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb1008150413 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 12s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/bphst2uaetdsm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #850

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/850/display/redirect?page=changes>

Changes:

[toran.sahu] fix typo - s/befrehand/beforehand

[noreply] [Website] update links to https (#23523)

[noreply] Support custom avro DatumReader when reading from BigQuery (#22718)

[noreply] Rename 'clean' Gradle task that required Flutter and has been breaking

[noreply] Model handler unit test (#23506)

[noreply] Content/multi model pipelines (#23498)

[noreply] [Tour of Beam][Frontend] Content Tree and SDK models (#23316) (#23417)

[noreply] Fix bug where `astype(CategoricalDtype)` is rejected (#23513)

[noreply] Bump actions/stale from 5 to 6 (#23331)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision fc6f400f9abbbe213b5573592cf7a938b5bf16d5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f fc6f400f9abbbe213b5573592cf7a938b5bf16d5 # timeout=10
Commit message: "Bump actions/stale from 5 to 6 (#23331)"
 > git rev-list --no-walk c8075de3799a3443ec287cc4cbafe49fa6397e97 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7750031623089907971.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb1007150424 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 2s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/buc4wwnwoacyy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #849

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/849/display/redirect?page=changes>

Changes:

[noreply] Add more typescript examples to the programming guide. (#23058)

[noreply] Merge pull request #23505: opt in for schema update. addresses #23504

[noreply] fix: only report backlog bytes on data records (#23493)

[noreply] Fix broken link in online clustering documentation (#23516)

[noreply] Grant actions using GITHUB_TOKEN the appropriate permission set (#23521)

[noreply] Fix failing Py37 BQ file loads test (#23334)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-6 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision c8075de3799a3443ec287cc4cbafe49fa6397e97 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f c8075de3799a3443ec287cc4cbafe49fa6397e97 # timeout=10
Commit message: "Fix failing Py37 BQ file loads test (#23334)"
 > git rev-list --no-walk b7b71361590b4fa8bac32a4541058bafdf0d1df1 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins3178645841622756262.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb1006150444 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy and 1 incompatible Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 18s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/hq5shvrvepcf4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #848

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/848/display/redirect?page=changes>

Changes:

[ningkang0957] Prep sidepanel 3.0.0 release

[noreply] Documented supported PyTorch versions (#22974)

[noreply] [Go SDK] Add fake impulse for inputs in Xlang Transform (#23383)

[noreply] Write permissions for issue closer/assigner

[noreply] GA Migration Adding Removal of /.m2/settings.xml (#23481)

[noreply] Bump google-cloud-spanner version for py containers (#23480)

[Moritz Mack] Ensure Java JMH benchmark tasks run sequentially to prevent failure when

[Moritz Mack] Fix validation of measurement name in InfluxDBPublisher (addresses

[noreply] group_id (#23445)

[noreply] Give issue tagger permission to write issues (#23485)

[noreply] Update UID (#23486)

[noreply] Improve error message in GcsUtil (#23482)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b7b71361590b4fa8bac32a4541058bafdf0d1df1 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b7b71361590b4fa8bac32a4541058bafdf0d1df1 # timeout=10
Commit message: "Improve error message in GcsUtil (#23482)"
 > git rev-list --no-walk 72237d61baf39333db034607491ee2720708cf7e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins925664673916669339.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb1005150422 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 12s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/3ghkjyszytaly

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #847

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/847/display/redirect?page=changes>

Changes:

[bulat.safiullin] [Website] fix navbar footer overlap #22698

[noreply] [Website] Add new Java quickstart (#22747)

[Robert Bradshaw] Require time-bound flag for non-UW streaming Python jobs for new SDKs.

[noreply] Fix JdbcIOIT, which seems to have never worked (#21796)

[noreply] Support DECIMAL logical type in python SDK (#23014)

[noreply] AI/ML pipelines master page documentation (#23443)

[noreply] Fix go fmt error (#23474)

[noreply] Revert "Add drop_example flag to the RunInference and Model Handler


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-13 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 72237d61baf39333db034607491ee2720708cf7e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 72237d61baf39333db034607491ee2720708cf7e # timeout=10
Commit message: "Revert "Add drop_example flag to the RunInference and Model Handler (#23266)" (#23392)"
 > git rev-list --no-walk 8ac77a99ba52e70f014db047dd961fdda598e001 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4353706877099509509.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb1004150420 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 3 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 36s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/4jdwbjlpeh43w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #846

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/846/display/redirect?page=changes>

Changes:

[noreply] [Tour Of Beam] return taskSnippetId/solutionSnippedId (#23419)

[noreply] Beam 21465 add requires stable input (#23230)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 8ac77a99ba52e70f014db047dd961fdda598e001 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 8ac77a99ba52e70f014db047dd961fdda598e001 # timeout=10
Commit message: "Beam 21465 add requires stable input (#23230)"
 > git rev-list --no-walk 3c7a3d40ce12eef5f4d361c67f1286b487847f65 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4317345647147958772.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb1003150428 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/olk373asgbgb2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #845

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/845/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 3c7a3d40ce12eef5f4d361c67f1286b487847f65 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 3c7a3d40ce12eef5f4d361c67f1286b487847f65 # timeout=10
Commit message: "JdbcIO fetchSize can be set to Integer.MIN_VALUE (#23444)"
 > git rev-list --no-walk 3c7a3d40ce12eef5f4d361c67f1286b487847f65 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins3005541339403567610.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb1002150426 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/aezpofrsgismk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #844

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/844/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Batch encoding and decoding of schema data.

[Robert Bradshaw] Add microbenchmark for batch row encoding.

[Robert Bradshaw] Add batch testing for standard row coders.

[noreply] Relax `pip` check in setup.py to allow installation via other package

[noreply] replaced tabs with spaces in readme file (#23446)

[noreply] [Playground] [Backend] Adding the tags field to the example response

[noreply] [Playground] [Backend] Edited the function for getting executable name

[noreply] Fix type inference for set/delete attr. (#23242)

[noreply] Support VR test including TestStream for Spark runner in streaming mode

[noreply] Add cron job to trigger Java JMH micro-benchmarks weekly  (#23388)

[noreply] JdbcIO fetchSize can be set to Integer.MIN_VALUE (#23444)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 3c7a3d40ce12eef5f4d361c67f1286b487847f65 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 3c7a3d40ce12eef5f4d361c67f1286b487847f65 # timeout=10
Commit message: "JdbcIO fetchSize can be set to Integer.MIN_VALUE (#23444)"
 > git rev-list --no-walk f2d426d2d2c088a573ab5a96e6e6bfc1bbf45b21 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1001766748215315454.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb1001150421 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ekbivmvoln7gg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #843

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/843/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Python cross language docs.

[srohde] Add documentation link to the interactive environment

[noreply] Bump google.golang.org/api from 0.97.0 to 0.98.0 in /sdks (#23394)

[noreply] Increase Go Dataflow Postcommit timeout to 5h (#23423)

[noreply] [Playground] [Backend] Updating endpoints for playground examples

[noreply] Send JavaScript messages to Playground iframes when switching the

[noreply] [Playground] [Backend] Adding SDK to the example response (#22871)

[noreply] [Playground] [Backend] Removing the code related to the Cloud Storage

[noreply] [BEAM-10785] Change RowAsDictJsonCoder to not ensure ASCII while

[noreply] Update Python katas to latest version of EduTools and Beam 2.41 (#23180)

[noreply] RunInference Benchmarks UI (#23426)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f2d426d2d2c088a573ab5a96e6e6bfc1bbf45b21 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f2d426d2d2c088a573ab5a96e6e6bfc1bbf45b21 # timeout=10
Commit message: "RunInference Benchmarks UI (#23426)"
 > git rev-list --no-walk 7b8aa28e34a70a178bd569b28c3e0fcf9d87dd6b # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1927767857381318113.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0930150424 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/fuwstthwgzad2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #842

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/842/display/redirect?page=changes>

Changes:

[noreply] [Tour Of Beam] API adjustments (#23349)

[noreply] Adds support in Samza Runner to run DoFn.processElement in parallel

[noreply] Regenerate Go Protos (#23408)

[noreply] Support google-cloud-spanner v3 and fixes broken unit tests (#23365)

[noreply] Add relevant docs to Cloud Profiler exceptions. (#23404)

[noreply] Update state cache to not fail when measuring object sizes. (#23391)

[noreply] Fix Small pytorch notebook bug fix (#23407)

[noreply] PubsubIO - Improve limit validations to consider attributes (#23023)

[noreply] Example of Online Clustering  (#23289)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 7b8aa28e34a70a178bd569b28c3e0fcf9d87dd6b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 7b8aa28e34a70a178bd569b28c3e0fcf9d87dd6b # timeout=10
Commit message: "Example of Online Clustering  (#23289)"
 > git rev-list --no-walk 91d79d973a327b6d22314c8e28bf1b93bc608c2b # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4306324860924646446.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0929150410 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 15s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/c3nnjncjqoium

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #841

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/841/display/redirect?page=changes>

Changes:

[shaojwu] make identifier of Date&DateTime to be a public static field

[shaojwu] make identifier of Time to be a public static field

[noreply] set upper bound on google-cloud-profiler (#23354)

[noreply] Add ISSUE#23071 to CHANGES.md (#23297)

[noreply] Pin objsize version to avoid regression in 0.6.0 (#23396)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 91d79d973a327b6d22314c8e28bf1b93bc608c2b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 91d79d973a327b6d22314c8e28bf1b93bc608c2b # timeout=10
Commit message: "Pin objsize version to avoid regression in 0.6.0 (#23396)"
 > git rev-list --no-walk 5550be5196461647f2b3b2fd05e137474e1a60d4 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins940320321618206725.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0928150426 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy and 1 incompatible Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 12s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
Publishing build scan failed due to network error 'java.net.SocketException: Unexpected end of file from server' (2 retries remaining)...
Publishing build scan failed due to network error 'java.net.SocketException: Unexpected end of file from server' (1 retry remaining)...

A network error occurred.

If you require assistance with this problem, please report it via https://gradle.com/help/plugin and include the following information via copy/paste.

----------
Gradle version: 7.5.1
Plugin version: 3.4.1
Request URL: https://status.gradle.com
Request ID: 29015505-9ab1-4a19-be15-9cb907d89f18
Exception: java.net.SocketException: Unexpected end of file from server
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #840

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/840/display/redirect?page=changes>

Changes:

[ningkang0957] Upgraded Flink on Dataproc support from Interactive Beam

[noreply] GA Migration PreCommit and PostCommit Tables in CI.md (#23372)

[noreply] Stack Trace Decoration for Beam Samza Runner (#23221)

[noreply] [#22478]: Add read_time support to Google Firestore connector (#22966)

[noreply] Changes CoGroupByKey typehint from List to Iterable (#22984)

[noreply] Fix TextSource incorrect handling in channels that return short reads.

[noreply] Add a tensorflow example to the run_inference_basic notebook (#23173)

[noreply] RunInference Benchmarks UI (#23371)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-6 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 5550be5196461647f2b3b2fd05e137474e1a60d4 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 5550be5196461647f2b3b2fd05e137474e1a60d4 # timeout=10
Commit message: "RunInference Benchmarks UI (#23371)"
 > git rev-list --no-walk 3a4d57eb8976c5f503b32d478a80b1800490f66f # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2036315104692917744.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0927150436 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 12s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/mmckksknbooug

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #839

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/839/display/redirect?page=changes>

Changes:

[noreply] Bump Java FnApi Container version to beam-master-20220923 (#23352)

[noreply] Bump org.nosphere.apache.rat from 0.7.0 to 0.8.0 (#23330)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 3a4d57eb8976c5f503b32d478a80b1800490f66f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 3a4d57eb8976c5f503b32d478a80b1800490f66f # timeout=10
Commit message: "Bump org.nosphere.apache.rat from 0.7.0 to 0.8.0 (#23330)"
 > git rev-list --no-walk c4fe823b4f9e8fc4711478749efb35cd143bfce2 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins3524795689817390467.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0926150438 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ya7xmwaqr4lj2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #838

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/838/display/redirect?page=changes>

Changes:

[noreply] Extract playground components (#23253)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision c4fe823b4f9e8fc4711478749efb35cd143bfce2 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f c4fe823b4f9e8fc4711478749efb35cd143bfce2 # timeout=10
Commit message: "Extract playground components (#23253)"
 > git rev-list --no-walk 2f1f1a76419a032ea2d70671e3c6c9fe86b0626f # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5676159119146266718.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0925150422 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/3hiifjf2nhawy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #837

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/837/display/redirect?page=changes>

Changes:

[noreply] lint fixes to go (#23351)

[noreply] Bump cloud.google.com/go/bigquery from 1.41.0 to 1.42.0 in /sdks


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2f1f1a76419a032ea2d70671e3c6c9fe86b0626f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2f1f1a76419a032ea2d70671e3c6c9fe86b0626f # timeout=10
Commit message: "Bump cloud.google.com/go/bigquery from 1.41.0 to 1.42.0 in /sdks (#23329)"
 > git rev-list --no-walk 90739533a8c84a9354197cac435c85b4ba002344 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8932832964821614262.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0924150420 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/irzuvmwmzwlmi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #836

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/836/display/redirect?page=changes>

Changes:

[Steve Niemitz] use avro DataFileReader to read avro container files

[noreply] Change google_cloud_bigdataoss_version to 2.2.8. (#23300)

[Moritz Mack] Fix Nexmark default log level

[noreply] Bump cloud.google.com/go/storage from 1.26.0 to 1.27.0 in /sdks (#23336)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 90739533a8c84a9354197cac435c85b4ba002344 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 90739533a8c84a9354197cac435c85b4ba002344 # timeout=10
Commit message: "Bump cloud.google.com/go/storage from 1.26.0 to 1.27.0 in /sdks (#23336)"
 > git rev-list --no-walk 762edd7f3a64f076dbee156fa48b8a7e5e6a512f # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5669222698800813259.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0923150424 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
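
(The --input_options above match the 2 GB advertised in the job name and banner:
2,097,152 records x 1,024-byte values = 2,147,483,648 bytes, i.e. 2 GiB of message
payload, with the 1-byte keys adding roughly a further 2 MiB.)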
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/k3e62fz4konsu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #835

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/835/display/redirect?page=changes>

Changes:

[bvolpato] Do not use .get() on ValueProvider during pipeline creation

[noreply] [Java SDK core] emit watermark from PeriodicSequence (#23301) (#23302)

[noreply] Extend protocol in windmill.proto used by google-cloud-dataflow-java

[noreply] Allow longer Class-Path entries (#23269)

[noreply] Improved pipeline translation in SparkStructuredStreamingRunner (#22446)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 762edd7f3a64f076dbee156fa48b8a7e5e6a512f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 762edd7f3a64f076dbee156fa48b8a7e5e6a512f # timeout=10
Commit message: "Improved pipeline translation in SparkStructuredStreamingRunner (#22446)"
 > git rev-list --no-walk d578e3df7c963e57f251fb27739fbc1d3811e722 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8719143650097081785.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0922150414 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/fo5ixy5indbk6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #834

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/834/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision d578e3df7c963e57f251fb27739fbc1d3811e722 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f d578e3df7c963e57f251fb27739fbc1d3811e722 # timeout=10
Commit message: "[BEAM-14378] [CdapIO] SparkReceiverIO Read via SDF (#17828)"
 > git rev-list --no-walk d578e3df7c963e57f251fb27739fbc1d3811e722 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1854098947061856879.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0921150420 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/5embazaj62ufw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #833

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/833/display/redirect?page=changes>

Changes:

[Pablo Estrada] Revert "Trying out property-based tests for Beam python coders (#22233)"

[noreply] Bump google.golang.org/api from 0.95.0 to 0.96.0 in /sdks (#23246)

[noreply] [Go SDK] Add timer coder support (#23222)

[noreply] Fix wrong comment (#23272)

[noreply] [Playground] [Backend] Cache component for playground examples (#22869)

[noreply] [BEAM-13416] Introduce Schema provider for AWS models and deprecate low

[noreply] [BEAM-14378] [CdapIO] SparkReceiverIO Read via SDF (#17828)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-12 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision d578e3df7c963e57f251fb27739fbc1d3811e722 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f d578e3df7c963e57f251fb27739fbc1d3811e722 # timeout=10
Commit message: "[BEAM-14378] [CdapIO] SparkReceiverIO Read via SDF (#17828)"
 > git rev-list --no-walk 5520fe064fc3b7196998d4597746119691eb6681 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6531923545032597377.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0920150434 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/q3aadpbdhth4c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #832

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/832/display/redirect?page=changes>

Changes:

[noreply] Enable verbose output for RAT Precommit (#23279)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 5520fe064fc3b7196998d4597746119691eb6681 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 5520fe064fc3b7196998d4597746119691eb6681 # timeout=10
Commit message: "Enable verbose output for RAT Precommit (#23279)"
 > git rev-list --no-walk f477b85f230ebb5dbd6b62540da078a33e3318ce # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins3850974803811121245.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0919150410 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/lu7prop5mzhc2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #831

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/831/display/redirect?page=changes>

Changes:

[noreply] Add drop_example flag to the RunInference and Model Handler (#23266)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f477b85f230ebb5dbd6b62540da078a33e3318ce (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f477b85f230ebb5dbd6b62540da078a33e3318ce # timeout=10
Commit message: "Add drop_example flag to the RunInference and Model Handler (#23266)"
 > git rev-list --no-walk 8754cc0904872d37edbb8b4d3b8d9f92aad94acc # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7367883017291103710.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0918150402 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/captrhyz3ok7e

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #830

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/830/display/redirect?page=changes>

Changes:

[noreply] TensorRT Initial commit (#22131)

[noreply] Fix Kafka performance test sourceOption to match expected hash (#23274)

[noreply] updated the pydoc for running a custom model on Beam (#23218)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 8754cc0904872d37edbb8b4d3b8d9f92aad94acc (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 8754cc0904872d37edbb8b4d3b8d9f92aad94acc # timeout=10
Commit message: "updated the pydoc for running a custom model on Beam (#23218)"
 > git rev-list --no-walk 8b2676782a62f8bdf912395267056c9f37251338 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2966784103559661964.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0917150412 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ruxk63ctg3h76

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #829

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/829/display/redirect?page=changes>

Changes:

[noreply] Revert "Exclude protobuf 3.20.2" (#23237)

[noreply] Fix outdated code in python sdk install (#23231)

[noreply] Bump up dataflow python container version to beam-master-20220914

[noreply] Improve the performance of TextSource by reducing how many byte[]s are

[noreply] Issue#21430 Avoid pruning DataframeTransforms (#23069)

[noreply] Bump cloud.google.com/go/bigquery from 1.40.0 to 1.41.0 in /sdks

[noreply] [Website] Correct spelling of structural (#23225)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 8b2676782a62f8bdf912395267056c9f37251338 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 8b2676782a62f8bdf912395267056c9f37251338 # timeout=10
Commit message: "[Website] Correct spelling of structural (#23225)"
 > git rev-list --no-walk 6911520a5165f26a6966a54dd369e07764e6334c # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8720898012671886377.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0916150418 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/v34x7qxiltkgs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #828

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/828/display/redirect?page=changes>

Changes:

[noreply] Fix assignees check

[noreply] Update cibuildwheel (#23024)

[noreply] Add section to docs on resource hints/RunInference (#23215)

[noreply] (BQ Python) Perform job waits in finish_bundle to allow BQ streaming

[noreply] Update to newest version of CloudPickle. (#23223)

[bulat.safiullin] [Website] update site navigation  #22902

[noreply] Resolve script parsing error when changing from bash to sh. (#23199)

[noreply] Bump cloud.google.com/go/bigquery from 1.39.0 to 1.40.0 in /sdks

[noreply] Bump github.com/google/go-cmp from 0.5.8 to 0.5.9 in /sdks (#23123)

[noreply] Update google-cloud-bigquery requirement from <3,>=1.6.0 to >=1.6.0,<4

[noreply] Optimize varint reading and writing for small ints. (#23192)

[noreply] Pass namespace through RunInference transform (#23182)

[noreply] [GitHub Actions] - INFRA scripts to implement GCP Self-hosted runners

[noreply] GA migration - Base actions to use for precommit and postcommit

[noreply] Test fix Kafka Performance test batch (#23191)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6911520a5165f26a6966a54dd369e07764e6334c (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6911520a5165f26a6966a54dd369e07764e6334c # timeout=10
Commit message: "Test fix Kafka Performance test batch (#23191)"
 > git rev-list --no-walk 66bbee84ed477d86008905646e68b100591b6f78 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins3630290691538011520.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0915150422 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy and 1 incompatible Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/tclpmcyyvwcho

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #827

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/827/display/redirect?page=changes>

Changes:

[noreply] Open Allow and test pyarrow 8.x and 9.x (#22997)

[noreply] (BQ Python) Pass project field from options or parameter when writing

[noreply] Update python-machine-learning.md (#23209)

[noreply] Pin the version of cloudpickle to 2.1.x (#23120)

[noreply] Add streaming test for Write API sink (#21903)

[noreply] [Go SDK] Proto changes for timer param (#23216)

[noreply] Bump github.com/testcontainers/testcontainers-go in /sdks (#23201)

[noreply] Update to objsize to 0.5.2 which is under BSD-3 license (fixes #23096)

[noreply] Exclude insignificant whitespace from cloud object (#23217)

[noreply] Trying out property-based tests for Beam python coders (#22233)

[noreply] Publish results of JMH benchmark runs (Java SDK) to InfluxDB (part of

[noreply] Exclude protobuf 3.20.2 (#23226)

[noreply] Fix IllegalStateException in StorageApiWriteUnshardedRecords error


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 66bbee84ed477d86008905646e68b100591b6f78 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 66bbee84ed477d86008905646e68b100591b6f78 # timeout=10
Commit message: "Fix IllegalStateException in StorageApiWriteUnshardedRecords error handling. (#23205)"
 > git rev-list --no-walk c654e41cb40acad026a2a4665383b60c0227f694 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins528259775677710053.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0914150957 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 15s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/mssb4hdbbwyak

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
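
Note on the recurring failure: at configuration time the load_tests build.gradle resolves a 'shadowJar' property on another project, and Gradle reports it as an unknown property because no task or extension with that name exists on that project when the script is evaluated. The same error repeats in every build logged below. What follows is a minimal Gradle (Groovy) sketch of that failure mode and one defensive lookup; the ':example:dataflow-worker' path, the println, and the warning text are hypothetical illustration only, not the actual Beam build script.

    // Minimal sketch, assuming a hypothetical ':example:dataflow-worker' project.
    def workerProject = project(':example:dataflow-worker')

    // Property-style access such as `workerProject.shadowJar` throws
    // "Could not get unknown property 'shadowJar'" if the referenced project
    // never registers that task (plugin removed/renamed) or has not been
    // evaluated yet at this point in the configuration phase.
    evaluationDependsOn(workerProject.path)   // pin evaluation order first
    def workerShadowJar = workerProject.tasks.findByName('shadowJar')

    if (workerShadowJar != null) {
        // e.g. hand the fat-jar location to whatever consumes it
        println "Worker jar: ${workerShadowJar.archiveFile.get().asFile}"
    } else {
        logger.warn("No shadowJar task on ${workerProject.path}; skipping worker jar")
    }

Each of these runs passes -PwithDataflowWorkerJar=true, so a guarded lookup of this kind would let configuration continue with a warning instead of aborting the whole load test; whether that is the desired behaviour for the Beam build is a separate question.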

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #826

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/826/display/redirect?page=changes>

Changes:

[noreply] Bump dataflow java fnapi container version to beam-master-20220830

[noreply] [Issue#23071] Fix AfterProcessingTime for Python to behave like Java

[noreply] Don't depend on java 11 docker container for go test (#23197)

[Moritz Mack] Annotate stateful VR test in TestStreamTest with UsesStatefulParDo

[Moritz Mack] Properly close Spark (streaming) context if Pipeline translation fails

[noreply] [Playground] [Backend] Datastore queries and mappers to get precompiled


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision c654e41cb40acad026a2a4665383b60c0227f694 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f c654e41cb40acad026a2a4665383b60c0227f694 # timeout=10
Commit message: "[Playground] [Backend] Datastore queries and mappers to get precompiled objects (#22868)"
 > git rev-list --no-walk 2113ffcac3fa3d7522ceb22d03919e6edafe5e90 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6737678629944130951.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0913150423 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 12s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/2ury7jc46nrpo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #825

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/825/display/redirect?page=changes>

Changes:

[noreply] [TPC-DS] Use common queries argument for Jenkins jobs (#23139)

[noreply] pubsublite: Reduce commit logspam (#22762)

[noreply] Added documentation in ACTIONS.md file (#23159)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2113ffcac3fa3d7522ceb22d03919e6edafe5e90 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2113ffcac3fa3d7522ceb22d03919e6edafe5e90 # timeout=10
Commit message: "Added documentation in ACTIONS.md file (#23159)"
 > git rev-list --no-walk 1526ca8c4cc6d58b3c28d816fc2597e51603d75f # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4858410426911906550.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0912150428 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 12s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/w2zze3npaz4ku

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #824

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/824/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1526ca8c4cc6d58b3c28d816fc2597e51603d75f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1526ca8c4cc6d58b3c28d816fc2597e51603d75f # timeout=10
Commit message: "Improvements to SchemaTransform implementations for BQ and Kafka (#23045)"
 > git rev-list --no-walk 1526ca8c4cc6d58b3c28d816fc2597e51603d75f # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7160520429474833951.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0911150419 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/47dichaeys4ri

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #823

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/823/display/redirect?page=changes>

Changes:

[bulat.safiullin] [Website] update shortcode languages from duplicate go to typescript

[cushon] Use a ClassLoadingStrategy that is compatible with Java 17+

[noreply] [Website] update case-studies logo images #22799 (#22793)

[noreply] [Website] change media-query max-width variable to ak-breakpoint-xl

[noreply] [Website] add overflow to code tags #22888 (#22427)

[noreply] Clean up Kafka Cluster and pubsub topic in rc validation script (#23021)

[noreply] Fix assertions in the Spanner IO IT tests (#23098)

[noreply] Use existing pickle_library flag in expansion service. (#23111)

[noreply] Assert pipeline results in performance tests (#23027)

[noreply] Consolidate Samza TranslationContext and PortableTranslationContext

[noreply] Improvements to SchemaTransform implementations for BQ and Kafka


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1526ca8c4cc6d58b3c28d816fc2597e51603d75f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1526ca8c4cc6d58b3c28d816fc2597e51603d75f # timeout=10
Commit message: "Improvements to SchemaTransform implementations for BQ and Kafka (#23045)"
 > git rev-list --no-walk 5734d3e3af68a22aa5a893d3cb9b138990b22911 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2353090329647728027.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0910150413 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zora55qdjxpkm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #822

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/822/display/redirect?page=changes>

Changes:

[bulat.safiullin] [Website] add paddings to pillars-item, change styles of footer logos

[bulat.safiullin] [Website] add table-container-wrapper #22896

[yathu] Decrease derby.locks.waitTimeout in jdbc unit test

[noreply] Auto-cancel old unit test Actions Runs (#23095)

[noreply] Merge pull request #23092 Cross-language tests in github actions.

[noreply] Update CHANGES.md for 2.42.0 cut, and add 2.43.0 section (#23108)

[noreply] remove `"io/ioutil"` package (#23001)

[noreply] Add one NER example to use a spaCy model with RunInference (#23035)

[noreply] Bump google.golang.org/api from 0.94.0 to 0.95.0 in /sdks (#23062)

[noreply] Implement JsonUtils (#22771)

[noreply] Support models returning a dictionary of outputs (#23087)

[noreply] [TPC-DS] Store metrics into BigQuery and InfluxDB (#22545)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-14 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 5734d3e3af68a22aa5a893d3cb9b138990b22911 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 5734d3e3af68a22aa5a893d3cb9b138990b22911 # timeout=10
Commit message: "Merge pull request #2281: [Website] update homepage mobile styles"
 > git rev-list --no-walk 9efa3787aefe9198c7985dd30b16691cdba61a7e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6150213836930741780.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0909150426 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/yww32ylwpjkfw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #821

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/821/display/redirect?page=changes>

Changes:

[clementg] allow non-lts jvm version, fallback on java 11 for runner

[clementg] Add a stricter java version method

[clementg] fall back to the nearest lts version

[noreply] Keep stale action from closing issues (#23067)

[Robert Bradshaw] Use cloudpickle for Java Python transforms.

[noreply] Merge pull request #22996: [BEAM-11205] Update GCP Libraries BOM

[Robert Burke] Moving to 2.43.0-SNAPSHOT on master branch.

[noreply] clean up comments and register functional DoFn in wordcount.go (#23057)

[noreply] [Tour Of Beam][backend] integration tests and GA workflow (#23032)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 9efa3787aefe9198c7985dd30b16691cdba61a7e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 9efa3787aefe9198c7985dd30b16691cdba61a7e # timeout=10
Commit message: "[Tour Of Beam][backend] integration tests and GA workflow (#23032)"
 > git rev-list --no-walk 0d937d4cd725965572d4720811fa2d6efaa8edf8 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8557624825898245103.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0908150406 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 12s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/q32gufpst2rqg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #820

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/820/display/redirect?page=changes>

Changes:

[Kenneth Knowles] Cosmetic checkstyle fix to TextRowCountEstimator

[Kenneth Knowles] Upgrade to Gradle 7.5.1

[Brian Hulette] Use typehints in benchmark utilities

[oleg.borisevich] fixing condition for db index creation

[Robert Bradshaw] Allow expansion service to choose pickler.

[noreply] Disable singleIterate (#23042)

[Robert Bradshaw] Accept "default" as pickler library.

[Robert Bradshaw] Clarifying comment.

[Heejong Lee] [BEAM-22856] PythonService Beam version compatibility

[chamikaramj] Fixes RunInference test failure

[noreply] Bump github.com/lib/pq from 1.10.6 to 1.10.7 in /sdks (#23061)

[noreply] Allowing more flexible precision for TIMESTAMP, DATETIME fields in

[noreply] Reenable run-inference tests on windows (#23044)

[noreply] [BEAM-12164] Support new value capture types NEW_ROW NEW_VALUES for s…

[noreply] Fix example registration input arity (#23059)

[noreply] Clarify inference example docs (#23018)

[noreply] [Playground] [Backend] Datastore queries and mappers to get examples


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0d937d4cd725965572d4720811fa2d6efaa8edf8 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0d937d4cd725965572d4720811fa2d6efaa8edf8 # timeout=10
Commit message: "[Playground] [Backend] Datastore queries and mappers to get examples (#22955)"
 > git rev-list --no-walk ca9ee909e57e36f0027001f1c101852378105490 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins187943057147427115.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0907150402 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/xbacdsf6xeele

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #819

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/819/display/redirect?page=changes>

Changes:

[noreply] Revert "Remove subprocess.PIPE usage by using a temp file (#22654)"

[noreply] Allow users to pass classloader to dynamically load JDBC drivers.

[noreply] Fix withCheckStopReadingFn to not cause the pipeline to crash (#22962)

[noreply] Inference benchmark tests (#21738)

[noreply] [Go SDK]: Add support for Google Cloud Profiler for pipelines (#22824)

[noreply] Listen to window messages to switch SDK and to load content (#22959)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision ca9ee909e57e36f0027001f1c101852378105490 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f ca9ee909e57e36f0027001f1c101852378105490 # timeout=10
Commit message: "Listen to window messages to switch SDK and to load content (#22959)"
 > git rev-list --no-walk 3c91e7b24a53a6a5b929ede58231bbc57c9ddced # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7085751590207907133.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0906150450 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy and 1 stopped Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/q53xsqqecp4uo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #818

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/818/display/redirect?page=changes>

Changes:

[noreply] Generalize interface of InfluxDBPublisher to support more use cases


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 3c91e7b24a53a6a5b929ede58231bbc57c9ddced (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 3c91e7b24a53a6a5b929ede58231bbc57c9ddced # timeout=10
Commit message: "Generalize interface of InfluxDBPublisher to support more use cases (#22238) (#22260)"
 > git rev-list --no-walk 25c6ed74c9846c89a92655c1e8d313ef87d6adb1 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins81936991881343306.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0905150358 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 stopped Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/cxehwsovm7zfq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #817

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/817/display/redirect?page=changes>

Changes:

[noreply] [#19857] Migrate to using a memory aware cache within the Python SDK


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 25c6ed74c9846c89a92655c1e8d313ef87d6adb1 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 25c6ed74c9846c89a92655c1e8d313ef87d6adb1 # timeout=10
Commit message: "[#19857] Migrate to using a memory aware cache within the Python SDK harness (#22924)"
 > git rev-list --no-walk 31561e2ff13147aa80f9f811e2a94ebe57b25374 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5062128391965480229.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0904150405 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 stopped Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rfvjfvxuxj7fq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #816

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/816/display/redirect?page=changes>

Changes:

[noreply] [Tour of Beam]: Welcome Screen frontend layout (#22794)

[noreply] Remove redundant testEventTimeTimerSetWithinAllowedLateness sickbay

[noreply] Adding support for Beam Schema Rows with BQ DIRECT_READ (#22926)

[noreply] Add java Bigquery IO known issue to beam 2.40 release blogpost (#22611)

[noreply] Update playground_deploy_examples.yml

[noreply] Add run-inference component for autolabeling (#22971)

[noreply] [Playground] [Infrastructure] Deleting the Cloud Storage Client (#22722)

[noreply] Updates Java RunInference to infer Python dependencies when possible

[noreply] Adding TensorFlow support to the Machine Learning overview page (#22949)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 31561e2ff13147aa80f9f811e2a94ebe57b25374 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 31561e2ff13147aa80f9f811e2a94ebe57b25374 # timeout=10
Commit message: "Adding TensorFlow support to the Machine Learning overview page (#22949)"
 > git rev-list --no-walk 4b46ef40289ddf33aac1aac0ca6741d96407bd3b # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5349387671859956702.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0903150348 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 stopped Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/4xgkjoh3xej3u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #815

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/815/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Update proto generation script due to BEAM-13939.

[Robert Bradshaw] Regenerate typescript protos.

[noreply] Add initial read_gbq wrapper (#22616)

[noreply] Minor: Fix lint failure (#22998)

[noreply] [Tour Of Beam][backend] get unit content (#22967)

[noreply] Allows to use databaseio with postgres driver (#22941)

[noreply] Bump cloud.google.com/go/storage from 1.25.0 to 1.26.0 in /sdks (#22954)

[noreply] [BEAM-22859] Allow the specification of extra packages for external


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4b46ef40289ddf33aac1aac0ca6741d96407bd3b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4b46ef40289ddf33aac1aac0ca6741d96407bd3b # timeout=10
Commit message: "[BEAM-22859] Allow the specification of extra packages for external Python transforms. (#22991)"
 > git rev-list --no-walk 2df47e7657ca2a9c3fd7b3c3fb578913d4ec4ec1 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1441962382760042168.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0902150453 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy and 1 stopped Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rmf3s3skyk7s4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #814

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/814/display/redirect?page=changes>

Changes:

[Brian Hulette] Extract utilities in dataframe.schemas

[Brian Hulette] Add pandas_type_compatibility with pandas BatchConverter implementations

[Brian Hulette] Use Batched DoFns at DataFrame API boundaries

[Brian Hulette] Move dtype conversion to pandas_type_compatibility

[Brian Hulette] Always register pandas BatchConverters

[Brian Hulette] Fix interactive runner tests

[Brian Hulette] Use pandas_type_compatibility BatchConverters for dataframe.schemas

[Brian Hulette] Skip test cases broken in pandas 1.1.x

[Brian Hulette] Address review comments

[Brian Hulette] yapf, typo in test

[noreply] Add ability to remove/clear map and set state (#22938)

[Brian Hulette] Add test to reproduce https://github.com/apache/beam/issues/22854

[Brian Hulette] Exercise row coder with nested optional struct

[Brian Hulette] Make RowTypeConstraint callable

[Brian Hulette] Add test to exercise RowTypeConstraint.__call__

[noreply] Fix gpu to cpu conversion with warning logs (#22795)

[noreply] Add Go stateful DoFns to CHANGES.md and fix linting violations (#22958)

[noreply] 22805: Upgrade Jackson version from 2.13.0 to 2.13.3 (#22806)

[noreply] Run cred rotation every month (#22977)

[noreply] [BEAM-12164] Synchronize access queue in ThroughputEstimator and

[noreply] Add some explanatory comments to the wordcount registration (#22989)

[noreply] Move Go examples under the cookbook directory to generic registration

[noreply] Improve BQ test utils to support JSON in a more simple manner (#22942)

[noreply] [fixes #22980] Migrate BeamFnLoggingClient to the new execution state


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2df47e7657ca2a9c3fd7b3c3fb578913d4ec4ec1 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2df47e7657ca2a9c3fd7b3c3fb578913d4ec4ec1 # timeout=10
Commit message: "[fixes #22980] Migrate BeamFnLoggingClient to the new execution state sampler. (#22981)"
 > git rev-list --no-walk d615b624e9ff211e857d026d541c4d56fd18e2d3 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins9070524098383077627.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0901150439 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy and 1 stopped Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 35s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ibpi4e46kwurs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #813

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/813/display/redirect?page=changes>

Changes:

[noreply] Minor: Fix option_from_runner_api typehint (#22946)

[noreply] Filter out unsupported state tests (#22963)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision d615b624e9ff211e857d026d541c4d56fd18e2d3 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f d615b624e9ff211e857d026d541c4d56fd18e2d3 # timeout=10
Commit message: "Filter out unsupported state tests (#22963)"
 > git rev-list --no-walk 3ede5b76e48b41e89bc67541ea5044ebe704e905 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6572178010655779109.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0831150425 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rcujv66224uz4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #812

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/812/display/redirect?page=changes>

Changes:

[yathu] Support Timestamp type in xlang JDBC Read and Write

[yathu] change urn name to millis_instant:v1

[yathu] Add standard_coders test

[yathu] Apply suggestions from code review

[yathu] Fix Java standard coder test

[yathu] Fix logical type with same language type gets completely hidden

[Robert Bradshaw] [BEAM-22923] Allow sharding specification for dataframe writes.

[noreply] [Playground] Update build_playground_backend.yml - add "Index creation"

[noreply] [Playground] [Backend] added SDK validation to save a code snippet

[noreply] Fix linting violations (#22934)

[noreply] [akvelon][tour-of-beam] backend bootstraps (#22556)

[noreply] Bump up postcommit timeout (#22937)

[noreply] Handle stateful windows correctly + integration test (#22918)

[noreply] Automatically infer state keys from their field name (#22922)

[noreply] Updates to multi-lang Java quickstart (#22927)

[noreply] Fix yaml duplicated mapping key (#22952)

[noreply] [Playground] [Infrastructure] Adding the Cloud Datastore client to save

[noreply] Fix jdbc date conversion offset 1 day (#22738)

[noreply] Set state integration test (#22935)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-13 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 3ede5b76e48b41e89bc67541ea5044ebe704e905 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 3ede5b76e48b41e89bc67541ea5044ebe704e905 # timeout=10
Commit message: "Set state integration test (#22935)"
 > git rev-list --no-walk 90baef11b6862e9f698df7ea888fe21dc69513e6 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8585951424587115679.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0830191607 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/t3c5ebj4e7mqq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #811

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/811/display/redirect?page=changes>

Changes:

[noreply] Add set state in Go (#22919)

[noreply] Go Map State integration test (#22898)

[noreply] Add clear function for bag state types (#22917)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 90baef11b6862e9f698df7ea888fe21dc69513e6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 90baef11b6862e9f698df7ea888fe21dc69513e6 # timeout=10
Commit message: "Add clear function for bag state types (#22917)"
 > git rev-list --no-walk e9089dd99630d939f0c38fbacbe97a283e429fc2 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6857237950390374827.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0829150403 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ws3taovbd5czg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #810

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/810/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e9089dd99630d939f0c38fbacbe97a283e429fc2 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e9089dd99630d939f0c38fbacbe97a283e429fc2 # timeout=10
Commit message: "[BEAM-12164] Feat: Added support to Cloud Spanner Change Streams connector for including transaction tags in the Change Stream records (#22769)"
 > git rev-list --no-walk e9089dd99630d939f0c38fbacbe97a283e429fc2 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1618376271091599908.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0828150410 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/an4ptijbt7ue4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #809

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/809/display/redirect?page=changes>

Changes:

[noreply] Pass user specified destination type to UpdateSchemaDestination 

[noreply] [Go SDK] Stream decode values in single iterations (#22904)

[noreply] Enable autosharding for BQ: #22818

[noreply] Update wordcount_minimal.py by removing pipeline_args.extend (#22786)

[noreply] Add map state in the Go Sdk (#22897)

[noreply] [BEAM-12164] Feat: Added support to Cloud Spanner Change Streams


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e9089dd99630d939f0c38fbacbe97a283e429fc2 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e9089dd99630d939f0c38fbacbe97a283e429fc2 # timeout=10
Commit message: "[BEAM-12164] Feat: Added support to Cloud Spanner Change Streams connector for including transaction tags in the Change Stream records (#22769)"
 > git rev-list --no-walk 8347b9e1d36cb8c2a1d863909d2d27a00a3efdaa # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6360238782817446115.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0827150406 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/mbyx6p4bygkmk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #808

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/808/display/redirect?page=changes>

Changes:

[Robert Bradshaw] [BEAM-22723] Yield BatchElement batches at end of window.

[noreply] Update sdks/python/apache_beam/transforms/util_test.py

[noreply] [Website] add Python to KinesisIO in connectors #22845 (#22841)

[noreply] Combining state integration test (#22846)

[cushon] Update to Byte Buddy 1.12.14

[cushon] Add a regression test

[cushon] Add spotless exclusion

[noreply] Small lint fixes (#22890)

[noreply] Preserve state on SDK switch (#22430) (#22735)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 8347b9e1d36cb8c2a1d863909d2d27a00a3efdaa (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 8347b9e1d36cb8c2a1d863909d2d27a00a3efdaa # timeout=10
Commit message: "Merge pull request #22814 from cushon/bb"
 > git rev-list --no-walk 42b1640a25d5dbdea08ae2feaa0d3e81f6278575 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8359479759866327575.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0826150401 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
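For reference, the --input_options payload in the invocation above is what makes this a "2GB" test: 2,097,152 records, each with a 1-byte key and a 1,024-byte value, i.e. about 2 GiB published to Pub/Sub. A quick sanity check (plain Groovy; the variable names simply mirror the JSON keys and are illustrative only):

    // Rough size of the generated load, matching the '2GB' banner and job name.
    def numRecords = 2097152L
    def keyBytes = 1
    def valueBytes = 1024
    def totalBytes = numRecords * (keyBytes + valueBytes)
    println "${totalBytes} bytes, about ${totalBytes / (1L << 30)} GiB"  // ~2.0 GiB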
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/kmqbt7hooxaoc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #807

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/807/display/redirect?page=changes>

Changes:

[chamikaramj] Updates old releases to use archive.apache.org

[noreply] Fix a few linting issues (#22842)

[noreply] Add combining state support (#22826)

[noreply] Bump cloud.google.com/go/pubsub from 1.24.0 to 1.25.1 in /sdks (#22850)

[noreply] Bump google.golang.org/grpc from 1.48.0 to 1.49.0 in /sdks (#22838)

[noreply] [Website] update videos section (#22772)

[noreply] Update Dataflow fnapi_container-version (#22852)

[noreply] Go SDK Katas: Update beam module dependency (#22753)

[noreply] unskip sklearn IT test (#22825)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 42b1640a25d5dbdea08ae2feaa0d3e81f6278575 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 42b1640a25d5dbdea08ae2feaa0d3e81f6278575 # timeout=10
Commit message: "unskip sklearn IT test (#22825)"
 > git rev-list --no-walk 702ce768f7b21d7bd10c0c3efd4e0719f2d03bad # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7689278238522169569.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0825150441 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/fqazjfu3qftoi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #806

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/806/display/redirect?page=changes>

Changes:

[Kenneth Knowles] Eliminate some null errors and rawtypes from sdks/java/core

[Kiley Sok] Update Beam 2.41.0 release docs

[noreply] [Playground] Setup Datastore in Playground project using Terraform -

[noreply] Add bag state support (#22816)

[Kiley Sok] Fix dates for 2.41.0 release

[noreply] added link to setup instructions in WordCount example (#22832)

[noreply] Bump google.golang.org/api from 0.93.0 to 0.94.0 in /sdks (#22839)

[noreply] Bump cloud.google.com/go/bigquery from 1.38.0 to 1.39.0 in /sdks

[noreply] Add an integration test for bag state (#22827)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 702ce768f7b21d7bd10c0c3efd4e0719f2d03bad (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 702ce768f7b21d7bd10c0c3efd4e0719f2d03bad # timeout=10
Commit message: "Add an integration test for bag state (#22827)"
 > git rev-list --no-walk c7938faea948403ed33336cc99a6ae2afa9f5c32 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4212287221470250831.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0824150414 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rjl2ywiy2j6lw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #805

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/805/display/redirect?page=changes>

Changes:

[yathu] Evaluate proper metric in TextIOIT

[Andrew Pilloud] Add Python nexmark to gradle

[Michael Luckey] Align neo4j error messages with API

[noreply] E2E basic state support (#22798)

[noreply] Add state integration test (#22815)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision c7938faea948403ed33336cc99a6ae2afa9f5c32 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f c7938faea948403ed33336cc99a6ae2afa9f5c32 # timeout=10
Commit message: "Merge pull request #22740: Evaluate proper metric in TextIOIT"
 > git rev-list --no-walk dfa5ec58a192a35c20e3f54c9300fd611a98f7b0 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1409649442438083384.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0823150415 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/jyzhwzdbrddk6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #804

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/804/display/redirect?page=changes>

Changes:

[noreply] Bump cloud.google.com/go/bigquery from 1.37.0 to 1.38.0 in /sdks

[noreply] Add Release category to release announcement blogs (#22785)

[noreply] [BEAM-13657] Update Python version used by mypy. (#22804)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision dfa5ec58a192a35c20e3f54c9300fd611a98f7b0 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f dfa5ec58a192a35c20e3f54c9300fd611a98f7b0 # timeout=10
Commit message: "[BEAM-13657] Update Python version used by mypy. (#22804)"
 > git rev-list --no-walk f921a2f1996cf906d994a9d62aeb6978bab09dd5 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8111814919932914703.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0822153953 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/xixpwid4jntdi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #803

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/803/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f921a2f1996cf906d994a9d62aeb6978bab09dd5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f921a2f1996cf906d994a9d62aeb6978bab09dd5 # timeout=10
Commit message: "Fix lint issues (#22800)"
 > git rev-list --no-walk f921a2f1996cf906d994a9d62aeb6978bab09dd5 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8593965944380473509.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0821150351 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/plhiqo2s3jh7g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #802

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/802/display/redirect?page=changes>

Changes:

[noreply] Modify RunInference to return PipelineResult for the benchmark tests

[noreply] Fix lint issues (#22800)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f921a2f1996cf906d994a9d62aeb6978bab09dd5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f921a2f1996cf906d994a9d62aeb6978bab09dd5 # timeout=10
Commit message: "Fix lint issues (#22800)"
 > git rev-list --no-walk 7a469fd20ef198a38e1df6af081062904dd1cbbb # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8398984364719382607.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0820150403 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rmdvd7ylgqiis

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #801

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/801/display/redirect?page=changes>

Changes:

[bulat.safiullin] [Website] add scroll to new position if anchor is present #22699

[randomstep] [BEAM-8701] bump commons-io to 2.7

[bulat.safiullin] [Website] remove text from Available contact channels table #22696

[bulat.safiullin] [Website] update commits link #22520

[cushon] Downgrade bytebuddy version to 1.11.0

[noreply] fixed column width in tables in Getting started from Spark guide

[noreply] Testing authentication for Playground (#22782)

[noreply] [BEAM-12776, fixes #21095] Limit parallel closes from the prior element

[noreply] [BEAM-13015, #21250] Reuse buffers when possible when writing on

[noreply] [Go SDK] Fix go lint errors (#22796)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 7a469fd20ef198a38e1df6af081062904dd1cbbb (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 7a469fd20ef198a38e1df6af081062904dd1cbbb # timeout=10
Commit message: "Merge pull request #22433: [BEAM-8701] bump commons-io to 2.7"
 > git rev-list --no-walk 75eb0b1431c84c98f2e16a9f535b0e11b0160d43 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2976450536618237898.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0819150404 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins
> Task :buildSrc:check
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 25s
10 actionable tasks: 8 executed, 1 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/3jxu5k63643bu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #800

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/800/display/redirect?page=changes>

Changes:

[noreply] Fix direct running mode multi_processing on win32 (#22730)

[noreply] Improve error message on schema issues (#22469)

[noreply] sklearn runinference regression example (#22088)

[noreply] [Website] add intuit case-study, add intuit quote-card (#22757)

[noreply] Avoid panic on type assert. (#22767)

[noreply] [#21935] Reject ill formed GroupByKey coders during pipeline.run

[noreply] Don't use batch interface for single object operations (#22432)

[noreply] Label kata changes with the language they're modifying (#22764)

[noreply] [Website] Add GitHub issue link (#22774)

[noreply] Fix some typos in the ML doc (#22763)

[noreply] Go stateful DoFns user side changes (#22761)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 75eb0b1431c84c98f2e16a9f535b0e11b0160d43 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 75eb0b1431c84c98f2e16a9f535b0e11b0160d43 # timeout=10
Commit message: "Go stateful DoFns user side changes (#22761)"
 > git rev-list --no-walk 60581e8b1b6e93889cce78542e99d1fea4105d54 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6404192359750385858.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0818150422 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/2i2rtefbvtay6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #799

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/799/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13015, #21250] Remove looking up thread local metrics container

[noreply] [fixes #22731] Publish nightly snapshot of legacy Dataflow worker jar.

[andyye333] Remove assert

[noreply] [fixes #22744] Update hadoop library patch versions to 2.10.2 and 3.2.4

[noreply] Update beam-master version for legacy (#22741)

[noreply] Bump google.golang.org/api from 0.92.0 to 0.93.0 in /sdks (#22752)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 60581e8b1b6e93889cce78542e99d1fea4105d54 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 60581e8b1b6e93889cce78542e99d1fea4105d54 # timeout=10
Commit message: "Bump google.golang.org/api from 0.92.0 to 0.93.0 in /sdks (#22752)"
 > git rev-list --no-walk 91c4b87aa95d89aac806ef374fda63637960bd6c # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1328754146291886658.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0817150423 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rsjrnw5q6g47o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
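
Every failure above stops at the same point: while Gradle is still evaluating sdks/python/apache_beam/testing/load_tests/build.gradle (line 49), the script reads a 'shadowJar' property from the Dataflow worker project before that project has registered the task, so Gradle reports "Could not get unknown property 'shadowJar'". A minimal Groovy build-script sketch of that pattern and a lazier alternative follows; it is an illustration only, not the actual Beam build file. The path segment masked as '****' in the logs is written as 'worker' purely as an assumption, as is the use of the Shadow plugin's 'shadowJar' task name.

// Hypothetical sketch, assuming a worker subproject that applies the Shadow plugin.
def workerPath = ':runners:google-cloud-dataflow-java:worker'   // assumed; masked in the logs above

// Pattern that triggers "Could not get unknown property 'shadowJar'":
// resolving the task as a dynamic property while this script is still being evaluated.
// def workerJar = project(workerPath).shadowJar

// Lazier alternative: make the evaluation order explicit and defer the task lookup.
evaluationDependsOn(workerPath)
def workerShadowJar = project(workerPath).tasks.named('shadowJar')

// Example consumer: stage the worker fat jar once it has actually been built.
tasks.register('stageWorkerShadowJar', Copy) {
    dependsOn workerShadowJar
    from workerShadowJar                      // TaskProvider resolves at execution time
    into layout.buildDirectory.dir('dataflow-worker-jar')
}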

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #798

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/798/display/redirect?page=changes>

Changes:

[Steve Niemitz] Fix UpdateSchemaDestination when source format is set to AVRO

[noreply] Add a dataflow override for runnerv1 to still use SDF on runnerv2.

[noreply] [Playground] Result filter bug (#22215)

[noreply] [Website] update case-studies layout (#22342)

[noreply] Implement KafkaSchemaTransformReadConfiguration (#22403)

[noreply] Handle single-precision float values in the standard coders tests


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 91c4b87aa95d89aac806ef374fda63637960bd6c (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 91c4b87aa95d89aac806ef374fda63637960bd6c # timeout=10
Commit message: "Handle single-precision float values in the standard coders tests properly (#22716)"
 > git rev-list --no-walk 21584b132d23a30c60ec6d8da65f60b525cfd768 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins9211931976431665544.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0816150404 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/i6h2dtybttuho

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #797

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/797/display/redirect?page=changes>

Changes:

[noreply] fix minor unreachable code caused by log.Fatal (#22618)

[noreply] Attempt to fix SpannerIO test flakes (#22688)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 21584b132d23a30c60ec6d8da65f60b525cfd768 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 21584b132d23a30c60ec6d8da65f60b525cfd768 # timeout=10
Commit message: "Attempt to fix SpannerIO test flakes (#22688)"
 > git rev-list --no-walk 184d8c59b34a70dac116517ac2791aeefa918bbb # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4069278984046472960.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0815150401 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/j52mnxtjecnjo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #796

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/796/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 184d8c59b34a70dac116517ac2791aeefa918bbb (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 184d8c59b34a70dac116517ac2791aeefa918bbb # timeout=10
Commit message: "Bump up python container versions (#22697)"
 > git rev-list --no-walk 184d8c59b34a70dac116517ac2791aeefa918bbb # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins557870123585846007.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0814150408 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/h3k2sx524lcry

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #795

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/795/display/redirect?page=changes>

Changes:

[noreply] [Playground] [Backend] added validation for snippet endpoints to avoid

[noreply] Add GeneratedClassRowTypeConstraint (#22679)

[noreply] [Playground] [Backend] Removing unused snippets manually and using the

[noreply] Implement PubsubSchemaTransformWriteConfiguration (#22262)

[noreply] Add support for FLOAT to Python RowCoder (#22626)

[noreply] Bump up python container versions (#22697)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 184d8c59b34a70dac116517ac2791aeefa918bbb (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 184d8c59b34a70dac116517ac2791aeefa918bbb # timeout=10
Commit message: "Bump up python container versions (#22697)"
 > git rev-list --no-walk 7a9bb76fe9f4c167c1d125db9d2cff9a1a315149 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7945912937181671329.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0813150412 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/hm35nkrcbee24

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #794

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/794/display/redirect?page=changes>

Changes:

[yathu] Bump mongo_java_driver to 3.12.11 and embed.mongo to 3.0.0

[noreply] Fix seed job (#22687)

[noreply] Bump actions/stale from 3 to 5 (#22684)

[noreply] Bump actions/upload-artifact from 2 to 3 (#22682)

[noreply] Bump actions/download-artifact from 2 to 3 (#22683)

[noreply] Add shunts for Beam typehints (#22680)

[noreply] Fix wordcount setup-java (#22700)

[noreply] Bump google.golang.org/api from 0.91.0 to 0.92.0 in /sdks (#22681)

[bulat.safiullin] [Website] add container with overflow-x to runners with table #22708

[noreply] Bump cloud.google.com/go/storage from 1.24.0 to 1.25.0 in /sdks (#22705)

[noreply] [Go SDK]: Implement standalone single-precision float encoder (#22664)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 7a9bb76fe9f4c167c1d125db9d2cff9a1a315149 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 7a9bb76fe9f4c167c1d125db9d2cff9a1a315149 # timeout=10
Commit message: "[Go SDK]: Implement standalone single-precision float encoder (#22664)"
 > git rev-list --no-walk cf9ea1f442636f781b9f449e953016bb39622781 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7902994283274065451.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0812150411 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 3s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/lddqxzodplv3g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #793

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/793/display/redirect?page=changes>

Changes:

[bulat.safiullin] [Website] update contribution content collapse

[noreply] Clean up checkstyle suppressions.xml (#22649)

[noreply] [Playground] [Infrastructure] format python code style (#22291)

[noreply] Minor: Add helpful names for parameterized dataframe.schemas_test

[noreply] [BEAM-14118, #21639] Use vendored gRPC 1.48.1 (#22628)

[Ismaël Mejía] Fix #22466 Add github actions dependency updates with dependabot

[noreply] Change Python PostCommits timeout (#22655)

[noreply] Revert "Persist ghprbPullId parameter in seed job (#22579)" (#22656)

[noreply] Bump actions/setup-java from 2 to 3 (#22666)

[noreply] Bump actions/labeler from 3 to 4 (#22670)

[noreply] Bump actions/setup-node from 2 to 3 (#22671)

[noreply] Bump actions/setup-go from 2 to 3 (#22669)

[noreply] Bump actions/setup-python from 2 to 4 (#22668)

[noreply] Bump actions/checkout from 2 to 3 (#22667)

[noreply] Fix broken link to Retry Policy blog (#22554)

[noreply] Include total in header of issue report (#22475)

[chamikaramj] Update vendored gRPC version for SpannerTransformRegistrarTest

[noreply] [Playground] Share any code feature frontend (#22477)

[noreply] Remove subprocess.PIPE usage by using a temp file (#22654)

[noreply] [#22647] Upgrade org.apache.samza to 1.6 (#22648)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision cf9ea1f442636f781b9f449e953016bb39622781 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f cf9ea1f442636f781b9f449e953016bb39622781 # timeout=10
Commit message: "[#22647] Upgrade org.apache.samza to 1.6 (#22648)"
 > git rev-list --no-walk fa9691fe2e95974e89fc5ff5ee572ca7bd52e1f2 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1028733339240793798.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0811153615 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy and 1 incompatible Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/gkbdmj4f3pi46

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #792

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/792/display/redirect?page=changes>

Changes:

[108862444+oborysevych] removed VladMatyunin from beam collaborators

[anandinguva98] Add stdlib distutils while building the wheels

[noreply] Skip

[noreply] Persist ghprbPullId parameter in seed job (#22579)

[noreply] Adhoc: Fix logging in Spark runner to avoid unnecessary creation of

[noreply] Improve exception when requested error tag does not exist (#22401)

[noreply] Reimplement Pub/Sub Lite's I/O using UnboundedSource. (#22612)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision fa9691fe2e95974e89fc5ff5ee572ca7bd52e1f2 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f fa9691fe2e95974e89fc5ff5ee572ca7bd52e1f2 # timeout=10
Commit message: "Reimplement Pub/Sub Lite's I/O using UnboundedSource. (#22612)"
 > git rev-list --no-walk d07bd6d2d7efe0b1da11b682b1fd88990186762d # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7089110608641014132.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0809171004 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/vxsc2rz3xwico

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #791

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/791/display/redirect?page=changes>

Changes:

[alexey.inkin] Fix retaining unsaved pipeline options (#22075)

[vlad.matyunin] modifed WithKeys Playground Example

[alexander.zhuravlev] [Playground] Removed banner from Playground header, deleted unused

[shivam] Add example for `Distinct` PTransform

[manitgupta] Fix bug in StructUtils

[noreply] [Playground][Backend][Bug]: Moving the initialization of properties file

[noreply] Bump cloud.google.com/go/bigquery from 1.36.0 to 1.37.0 in /sdks

[noreply] Minor: Clean up an assertion in schemas_test (#22613)

[noreply] Exclude testWithShardedKeyInGlobalWindow on streaming runner v1 (#22593)

[noreply] Pub/Sub Schema Transform Read Provider (#22145)

[noreply] Update BigQuery URI validation to allow more valid URIs through (#22452)

[noreply] Add units tests for SpannerIO (#22428)

[noreply] Bump google.golang.org/api from 0.90.0 to 0.91.0 in /sdks (#22568)

[noreply] Fix for #22631 KafkaIO considers readCommitted() as it would commit back

[noreply] [CdapIO] Add CdapIO dashboard in Grafana (#22641)

[noreply] Add information on how to take/close issues in the contribution guide.


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision d07bd6d2d7efe0b1da11b682b1fd88990186762d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f d07bd6d2d7efe0b1da11b682b1fd88990186762d # timeout=10
Commit message: "Add information on how to take/close issues in the contribution guide. (#22640)"
 > git rev-list --no-walk 1f2186de8eedd20c6d8d3ce31bdaa5334b5b23ea # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins3040995601504528755.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0809150419 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 2s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ztofmquypjlne

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #790

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/790/display/redirect?page=changes>

Changes:

[noreply] Add PyDoc buttons to the top and bottom of the Machine Learning page


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1f2186de8eedd20c6d8d3ce31bdaa5334b5b23ea (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1f2186de8eedd20c6d8d3ce31bdaa5334b5b23ea # timeout=10
Commit message: "Add PyDoc buttons to the top and bottom of the Machine Learning page (#22458)"
 > git rev-list --no-walk 17fb9c0342064cd4375b0d7f2c37e12a175d03ef # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7451530249374269868.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0808153220 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/kwcfiijfvrybi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #789

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/789/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 17fb9c0342064cd4375b0d7f2c37e12a175d03ef (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 17fb9c0342064cd4375b0d7f2c37e12a175d03ef # timeout=10
Commit message: "Return type for _ExpandIntoRanges DoFn should be Iterable. (#22548)"
 > git rev-list --no-walk 17fb9c0342064cd4375b0d7f2c37e12a175d03ef # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins3431325426464779527.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0807150356 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dzyo6gatv3npm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #788

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/788/display/redirect?page=changes>

Changes:

[yathu] Moving misplaced CHANGES from template to 2.41.0

[noreply] Add Import transform to Go FhirIO (#22460)

[noreply] Allow unsafe triggers for python nexmark benchmarks (#22596)

[noreply] pubsublite: Fix max offset for computing backlog (#22585)

[noreply] Add support when writing to locked buckets by handling

[noreply] [BEAM-14118, #21639] Vendor gRPC 1.48.1 (#22607)

[noreply] [21894] Validates inference_args early (#22282)

[noreply] Return type for _ExpandIntoRanges DoFn should be Iterable. (#22548)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 17fb9c0342064cd4375b0d7f2c37e12a175d03ef (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 17fb9c0342064cd4375b0d7f2c37e12a175d03ef # timeout=10
Commit message: "Return type for _ExpandIntoRanges DoFn should be Iterable. (#22548)"
 > git rev-list --no-walk 6910d770b76d14558da4fee27b66601b4989440e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins9126683278298291924.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0806150403 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/trt5ai22nsxp2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #787

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/787/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #22347: [22188]Set allowed timestamp skew

[noreply] Added experimental annotation to fixes #22564 (#22565)

[noreply] [BEAM-14117] Delete vendored bytebuddy gradle build (#22594)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6910d770b76d14558da4fee27b66601b4989440e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6910d770b76d14558da4fee27b66601b4989440e # timeout=10
Commit message: "[BEAM-14117] Delete vendored bytebuddy gradle build (#22594)"
 > git rev-list --no-walk 1a42618b153b7c985c537f4eaa6ab01e3e2e1d11 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2124180027231551407.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0805150542 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy and 1 stopped Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 24s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/krttbt5efqcwi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #786

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/786/display/redirect?page=changes>

Changes:

[noreply] Update run_inference_basic.ipynb

[noreply] Update CHANGE.md after 2.41.0 cut (#22577)

[noreply] Convert to BeamSchema type from ReadfromBQ (#17159)

[noreply] Fix deleteTimer in InMemoryTimerInternals and enable VR tests for

[noreply] Update Dataflow container version (#22580)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1a42618b153b7c985c537f4eaa6ab01e3e2e1d11 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1a42618b153b7c985c537f4eaa6ab01e3e2e1d11 # timeout=10
Commit message: "Update Dataflow container version (#22580)"
 > git rev-list --no-walk bf39489b2a1fd45e6798483d083e4ad240f66891 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1464535940740890707.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0804150409 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/x5yuasko43vvy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #785

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/785/display/redirect?page=changes>

Changes:

[Valentyn Tymofieiev] add zstd compression support according to issue 22393

[Valentyn Tymofieiev] Regenerate the container dependencies.

[noreply] Remove normalization in Pytorch Image Segmentation example (#22371)

[chamikaramj] Mention Java RunInference support in the Website

[noreply] Downgrade less informative logs during write to files (#22273)

[noreply] Beam ml notebooks (#22510)

[noreply] Add clearer error message for xlang transforms on the Go Direct Runner

[noreply] [CdapIO] Add integration tests for CdapIO (Batch) (#22313)

[noreply] Bugfix: Fix broken assertion in PipelineTest (#22485)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision bf39489b2a1fd45e6798483d083e4ad240f66891 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f bf39489b2a1fd45e6798483d083e4ad240f66891 # timeout=10
Commit message: "Merge pull request #22557: Mention Java RunInference support in the Website"
 > git rev-list --no-walk 48513adc665c32b32f50ff123bb18b66ca302934 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7224794548040047544.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0803150410 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/etrsotyrw7fha

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #784

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/784/display/redirect?page=changes>

Changes:

[noreply] Exclude grpcio==1.48.0 (#22539)

[noreply] Merge PR #22304 fixing #22331 fixing JDBC IO IT

[noreply] Update pytest to support Python 3.10 (#22055)

[noreply] Update the imprecise link. (#22549)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 48513adc665c32b32f50ff123bb18b66ca302934 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 48513adc665c32b32f50ff123bb18b66ca302934 # timeout=10
Commit message: "Update the imprecise link. (#22549)"
 > git rev-list --no-walk e5e3cb25ca4fc2e31c10eb3dbda8289c6bfc7140 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1570776885633175696.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0802152014 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/epdwcvkxxn44k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #783

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/783/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e5e3cb25ca4fc2e31c10eb3dbda8289c6bfc7140 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e5e3cb25ca4fc2e31c10eb3dbda8289c6bfc7140 # timeout=10
Commit message: "Improve concrete error message (#22536)"
 > git rev-list --no-walk e5e3cb25ca4fc2e31c10eb3dbda8289c6bfc7140 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8072414429117372895.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0801150421 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/2n74wkohqhy5g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #782

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/782/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e5e3cb25ca4fc2e31c10eb3dbda8289c6bfc7140 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e5e3cb25ca4fc2e31c10eb3dbda8289c6bfc7140 # timeout=10
Commit message: "Improve concrete error message (#22536)"
 > git rev-list --no-walk e5e3cb25ca4fc2e31c10eb3dbda8289c6bfc7140 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7656613693184781691.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0731150408 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/o5q37vtyhm5fy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #781

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/781/display/redirect?page=changes>

Changes:

[noreply] Change _build import from setuptools to distutils (#22503)

[noreply] Remove stringx package (#22534)

[noreply] Improve concrete error message (#22536)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e5e3cb25ca4fc2e31c10eb3dbda8289c6bfc7140 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e5e3cb25ca4fc2e31c10eb3dbda8289c6bfc7140 # timeout=10
Commit message: "Improve concrete error message (#22536)"
 > git rev-list --no-walk f4bd7b7236fdf4ca8068d8c42c6c7023646c015d # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1811837316239827977.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0730150359 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ijpyj32mavppg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #780

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/780/display/redirect?page=changes>

Changes:

[chamikaramj] Remove unnecessary reference to use_runner_v2 experiment in x-lang

[yixiaoshen] Fix typo in Datastore V1ReadIT test

[noreply] Relax the google-api-core dependency. (#22513)

[noreply] Bump google.golang.org/protobuf from 1.28.0 to 1.28.1 in /sdks (#22517)

[noreply] Bump google.golang.org/api from 0.89.0 to 0.90.0 in /sdks (#22518)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f4bd7b7236fdf4ca8068d8c42c6c7023646c015d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f4bd7b7236fdf4ca8068d8c42c6c7023646c015d # timeout=10
Commit message: "Bump google.golang.org/api from 0.89.0 to 0.90.0 in /sdks (#22518)"
 > git rev-list --no-walk c6624c36cbbbc94f78ab1fd4660efd8132fa1952 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2865740341833837237.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0729150408 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ih5siw4r67grm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
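The failure above (and in every build below) is a Gradle configuration-time error, not a test failure: line 49 of the load_tests build.gradle evidently dereferences a 'shadowJar' task on the Dataflow worker project, that property can no longer be resolved, and the build aborts in about a second before the Python load test ever starts. A minimal Groovy sketch of the failing pattern and a guarded lookup follows; ':runners:example-worker' is a hypothetical stand-in for the project path that the log masks as '****', and this is not the actual Beam build file.

// Hedged sketch only, assuming a hypothetical ':runners:example-worker' project.
// The error is typically raised by a configuration-time reference such as:
//   def workerJar = project(':runners:example-worker').shadowJar.archivePath
// which fails with "Could not get unknown property 'shadowJar'" when the target
// project has not been evaluated yet or no longer registers a shadowJar task.
evaluationDependsOn(':runners:example-worker')   // configure the worker project first
def workerShadowJar = project(':runners:example-worker').tasks.findByName('shadowJar')
if (workerShadowJar == null) {
    logger.warn('No shadowJar task found on :runners:example-worker')
}

Whether a fix belongs in the load_tests build script or in the worker project depends on which side changed; the identical error in every run below points to a repository-level regression rather than a flaky test.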

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #779

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/779/display/redirect?page=changes>

Changes:

[bulat.safiullin] [Website] remove beam-summit 2022 container with all related files

[samuelw] Fixes #22438. Ensure that WindmillStateReader completes all batched read

[noreply] Upgrades pip before installing Beam for Python default expansion service

[noreply] [Go SDK]: Plumb allowed lateness to execution (#22476)

[Valentyn Tymofieiev] Restrict google-api-core

[Valentyn Tymofieiev] Regenerate the container dependencies.

[noreply] Replace distutils with supported modules. (#22456)

[noreply] [22369] Default Metrics for Executable Stages in Samza Runner (#22370)

[Kiley Sok] Moving to 2.42.0-SNAPSHOT on master branch.

[noreply] Remove stripping of step name. Replace removing only suffix step name

[noreply] Add read/write PubSub integration example fhirio pipeline (#22306)

[noreply] Remove deprecated Session runner (#22505)

[noreply] Add Go test status to the PR template (#22508)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision c6624c36cbbbc94f78ab1fd4660efd8132fa1952 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f c6624c36cbbbc94f78ab1fd4660efd8132fa1952 # timeout=10
Commit message: "Add Go test status to the PR template (#22508)"
 > git rev-list --no-walk 0760f13c4a5ca1dcfa0e2fad7d875e2d2f050963 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1487969989776985474.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0728150425 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/6pw4zjsoqcayw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #778

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/778/display/redirect?page=changes>

Changes:

[chamikaramj] Adds KV support for the Java RunInference transform.

[noreply] Replace distutils with supported modules. (#21968)

[noreply] Revert "Replace distutils with supported modules. " (#22453)

[noreply] Enable configuration to avoid successfully written Table Row propagation

[noreply] lint fixes for recent import (#22455)

[noreply] Bump Python Combine LoadTests timeout to 12 hours (#22439)

[noreply] convert windmill min timestamp to beam min timestamp (#21915)

[noreply] [CdapIO] Fixed necessary warnings (#22399)

[noreply] [#22051]: Add read_time support to Google Cloud Datastore connector

[noreply] 21730 fix offset resetting (#22450)

[noreply] Bump google.golang.org/api from 0.88.0 to 0.89.0 in /sdks (#22464)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0760f13c4a5ca1dcfa0e2fad7d875e2d2f050963 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0760f13c4a5ca1dcfa0e2fad7d875e2d2f050963 # timeout=10
Commit message: "Bump google.golang.org/api from 0.88.0 to 0.89.0 in /sdks (#22464)"
 > git rev-list --no-walk 5141ad8790a57e2fa62af607f32736e3eed399e3 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6808241897335409012.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0727150407 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/uenjw4fqrxmaa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #777

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/777/display/redirect?page=changes>

Changes:

[Steve Niemitz] Fix overly aggressive null check in RowWriterFactory

[bulat.safiullin] add executeAsTemplate to head, head_homepage, add absURL to page-nav.js,

[noreply] Bump cloud.google.com/go/bigquery from 1.35.0 to 1.36.0 in /sdks

[noreply] Disallow EventTimes in iterators (#22435)

[noreply] Update the upper bound for google-cloud-recommendations-ai. (#22398)

[noreply] LoadTestsBuilder: Disallow whitespace in option values (#22437)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 5141ad8790a57e2fa62af607f32736e3eed399e3 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 5141ad8790a57e2fa62af607f32736e3eed399e3 # timeout=10
Commit message: "Merge pull request #21949: [WEBSITE] fix relative paths bug on staging in js files"
 > git rev-list --no-walk 54b0784da7ccba738deff22bd83fbc374ad21d2e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins700630480883230391.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0726150417 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/7sxtxcjn3hrcu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #776

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/776/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 54b0784da7ccba738deff22bd83fbc374ad21d2e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 54b0784da7ccba738deff22bd83fbc374ad21d2e # timeout=10
Commit message: "Remove spaces in experiments (#22423)"
 > git rev-list --no-walk 54b0784da7ccba738deff22bd83fbc374ad21d2e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1023090805032605811.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0725150425 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/iukbw53ol3qhw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #775

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/775/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 54b0784da7ccba738deff22bd83fbc374ad21d2e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 54b0784da7ccba738deff22bd83fbc374ad21d2e # timeout=10
Commit message: "Remove spaces in experiments (#22423)"
 > git rev-list --no-walk 54b0784da7ccba738deff22bd83fbc374ad21d2e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1843953169948731700.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0724150410 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/og7dqrp5mqoyi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #774

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/774/display/redirect?page=changes>

Changes:

[bulat.safiullin] [Website] change getting window width method

[noreply] Bump cloud.google.com/go/storage from 1.23.0 to 1.24.0 in /sdks (#22377)

[Pablo Estrada] Removing experimental annotation from JdbcIO

[noreply] Drop timeseries:postCommit dependency (#22414)

[noreply] Deduplicate identical environments in a pipeline. (#22308)

[noreply] Skip failing torch post commit test (#22418)

[noreply] Log level fix on local runner (#22420)

[noreply] Update element_type inference (default_type_hints) for batched DoFns

[noreply] Remove spaces in experiments (#22423)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 54b0784da7ccba738deff22bd83fbc374ad21d2e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 54b0784da7ccba738deff22bd83fbc374ad21d2e # timeout=10
Commit message: "Remove spaces in experiments (#22423)"
 > git rev-list --no-walk b9f6af54d52428dcff910f9fa8b01fa0d474f5e0 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6289864065729569260.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0723150409 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/v7fwgorqkoegc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #773

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/773/display/redirect?page=changes>

Changes:

[balazs.nemeth] BEAM-14525 Fix for Protobuf getter/setter method name discovery issue

[balazs.nemeth] BEAM-14525 Added a proto message with the problematic properties to use

[balazs.nemeth] PR CR: updating issue links

[noreply] Add accept-language header for MPL license (#22395)

[noreply] Bump terser from 5.9.0 to 5.14.2 in

[noreply] Fixes #22156: Fix Spark3 runner to compile against Spark 3.2/3.3 and add

[Moritz Mack] Closes #22407: Separate sources for SparkStructuredStreamingRunner for

[Moritz Mack] Add deprecation warning for Spark 2 in SparkStructuredStreamingRunner


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b9f6af54d52428dcff910f9fa8b01fa0d474f5e0 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b9f6af54d52428dcff910f9fa8b01fa0d474f5e0 # timeout=10
Commit message: "Merge pull request #22408 from mosche/22407-separate-spark-ssrunner-sources"
 > git rev-list --no-walk 50346b5d1414f671a60f117e0f50a0c16172afb7 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5568699033481057182.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0722150417 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dmc6fexp5f7lo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #772

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/772/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Support combiner lifting.

[noreply] Bump google.golang.org/api from 0.87.0 to 0.88.0 in /sdks (#22350)

[Robert Bradshaw] More clarification.

[noreply] [CdapIO] HasOffset interface was implemented (#22193)

[noreply] added olehborysevych as collaborator (#22391)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 50346b5d1414f671a60f117e0f50a0c16172afb7 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 50346b5d1414f671a60f117e0f50a0c16172afb7 # timeout=10
Commit message: "added olehborysevych as collaborator (#22391)"
 > git rev-list --no-walk 4821e035c148df1ed7eb9e7054e47fe2a7003a1f # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6040133883200477665.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0721150411 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 2s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zt73satenzsr4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #771

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/771/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Require unique names for stages.

[noreply] cleaned up types in standard_coders.ts (#22316)

[noreply] JMH module for sdks:java:core with benchmarks for

[noreply] Bump cloud.google.com/go/pubsub from 1.23.1 to 1.24.0 in /sdks (#22332)

[Luke Cwik] [#22181] Fix java package for SDK java core benchmark

[Luke Cwik] Allow jmhTest to run concurrently with other jmhTest instances

[noreply] [BEAM-13015, #21250] Optimize encoding to a ByteString (#22345)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4821e035c148df1ed7eb9e7054e47fe2a7003a1f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4821e035c148df1ed7eb9e7054e47fe2a7003a1f # timeout=10
Commit message: "[BEAM-13015, #21250] Optimize encoding to a ByteString (#22345)"
 > git rev-list --no-walk efde3f174c7ac502b24116d308249af52db52a2c # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5505139809395329978.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0720150419 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rr7mkgawmou6u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #770

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/770/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14117] Unvendor bytebuddy dependency (#17317)

[noreply] Use npm ci instead of install in CI (#22323)

[noreply] Fix typo in use_single_core_per_container logic. (#22318)

[noreply] [#22319] Regenerate proto2_coder_test_messages_pb2.py manually (#22320)

[noreply] Add links to the new RunInference content to Learning Resources (#22325)

[noreply] Unskip RunInference IT tests (#22324)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision efde3f174c7ac502b24116d308249af52db52a2c (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f efde3f174c7ac502b24116d308249af52db52a2c # timeout=10
Commit message: "Unskip RunInference IT tests (#22324)"
 > git rev-list --no-walk 799eed2cc38ed6319d7b54a3ee0114c539d0f0af # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1118994328673007428.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0719150356 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/2mlhnzlb7qiie

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #769

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/769/display/redirect?page=changes>

Changes:

[Alexey Romanenko] [website] Add TPC-DS benchmark documentation

[noreply] Increase streaming server timeout  (#22280)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 799eed2cc38ed6319d7b54a3ee0114c539d0f0af (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 799eed2cc38ed6319d7b54a3ee0114c539d0f0af # timeout=10
Commit message: "Merge pull request #22047: [website] Add TPC-DS benchmark documentation"
 > git rev-list --no-walk 20274f35c9fd11f4d815c8d1d88df2ad874dfa3e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4603993909006408682.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0718150411 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/w62ovapgst73u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #768

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/768/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 20274f35c9fd11f4d815c8d1d88df2ad874dfa3e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 20274f35c9fd11f4d815c8d1d88df2ad874dfa3e # timeout=10
Commit message: "Merge pull request #22259 from akvelon/pg-trigger-deploy-examples"
 > git rev-list --no-walk 20274f35c9fd11f4d815c8d1d88df2ad874dfa3e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins648330945179033794.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0717150354 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/xkn2247pyrdjo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #767

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/767/display/redirect?page=changes>

Changes:

[noreply] Bump protobufjs from 6.11.2 to 6.11.3 in /sdks/typescript

[vlad.matyunin] enabled multifile flag for multifile examples (PG)

[Robert Bradshaw] Don't try to parse non-flags as retained pipeline options.

[chamikaramj] Enables UnboundedSource wrapped SDF Kafka source by default for x-lang

[noreply] Merge pull request #22140 from [Playground Task] Sharing any code API

[bulat.safiullin] [Website] add playground section, update playground, update get-started

[noreply] RunInference documentation updates. (#22236)

[noreply] Turn pr bot on for remaining common labels (#22257)

[noreply] Reviewing the RunInference ReadMe file for clarity. (#22069)

[noreply] Collect heap profile on OOM on Dataflow (#22225)

[noreply] fixing the missing wrap around ring range read (#21786)

[noreply] Update RunInference documentation (#22250)

[noreply] Rewrote Java multi-language pipeline quickstart (#22263)

[noreply] Merge pull request #22300 from Fixed [Playground] DeployExamples,


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 20274f35c9fd11f4d815c8d1d88df2ad874dfa3e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 20274f35c9fd11f4d815c8d1d88df2ad874dfa3e # timeout=10
Commit message: "Merge pull request #22259 from akvelon/pg-trigger-deploy-examples"
 > git rev-list --no-walk 673a4cc793036050596aa340d91f26b461cb88e5 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1408643837133180674.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0716150406 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/4memxzpklbbls

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #766

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/766/display/redirect?page=changes>

Changes:

[vitaly.terentyev] [BEAM-14101] Add Spark Receiver IO package and ReceiverBuilder

[egalpin] Moves timestamp skew override to correct place

[egalpin] Adds TestStream to verify window preservation of ElasticsearchIO#write

[egalpin] Removes unnecessary line

[Heejong Lee] [BEAM-22229] Override external SDK container URLs for Dataflow by

[egalpin] Adds validation that ES#Write outputs are in expected windows

[egalpin] Updates window verification test to assert the exact docs in the window

[egalpin] Uses guava Iterables over shaded avro version

[danthev] Fix query retry in Java FirestoreIO.

[noreply] Pg auth test (#22277)

[noreply] [BEAM-14073] [CdapIO] CDAP IO for batch plugins: Read, Write. Unit tests

[Heejong Lee] update

[noreply] [Fix #22151] Add fhirio.Deidentify transform (#22152)

[noreply] Remove locks around ExecutionStateSampler (#22190)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 673a4cc793036050596aa340d91f26b461cb88e5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 673a4cc793036050596aa340d91f26b461cb88e5 # timeout=10
Commit message: "Merge pull request #22183 from egalpin/egalpin/timestamp-skew-es"
 > git rev-list --no-walk 67e6726ffeb47d2ada0122369fa230833ce0f026 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4326759813939225156.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0715150407 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/wlcbeweytc7y2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #765

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/765/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-14506] Adding testcases and examples for xlang Python RunInference

[Heejong Lee] update

[Heejong Lee] update

[noreply] Move youngoli to the reviewer exclusion list (#22195)

[noreply] Bump google.golang.org/api from 0.86.0 to 0.87.0 in /sdks (#22253)

[noreply] Bump cloud.google.com/go/bigquery from 1.34.1 to 1.35.0 in /sdks

[noreply] Bump google.golang.org/grpc from 1.47.0 to 1.48.0 in /sdks (#22252)

[noreply] Merge pull request #15786: Add gap-filling transform for timeseries

[chamikaramj] Adds an experiment that allows opting into using Kafka SDF-wrapper

[noreply] Defocus iframe on blur or mouseout (#22153) (#22154)

[noreply] Fix pydoc rendering for annotated classes (#22121)

[noreply] Fix typo in comment (#22266)

[noreply] Split words on new lines or spaces (#22270)

[noreply] Replace \r\n, not just \n


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 67e6726ffeb47d2ada0122369fa230833ce0f026 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 67e6726ffeb47d2ada0122369fa230833ce0f026 # timeout=10
Commit message: "Replace \r\n, not just \n"
 > git rev-list --no-walk fa5bcfa36137f9ba93dcd3c2a7b23be061edb065 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5568788148525185145.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0714150425 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/xmpbwqdisdgrk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #764

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/764/display/redirect?page=changes>

Changes:

[naireenhussain] add new pubsub urn

[Pablo Estrada] Several requests to show experiments in Dataflow UI

[byronellis] Add org.pentaho to calcite relocated packages to fix vendoring

[noreply] Adding VladMatyunin as collaborator (#22239)

[noreply] Mark session runner as deprecated (#22242)

[noreply] Update google-cloud-core dependency to <3 (#22237)

[noreply] Move WC integration test to generic registration (#22248)

[noreply] Move Xlang Go examples to generic registration (#22249)

[noreply] Move Go Primitives Integration Tests to Generic Registration (#22247)

[noreply] Move native Go examples to generic registration (#22245)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision fa5bcfa36137f9ba93dcd3c2a7b23be061edb065 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f fa5bcfa36137f9ba93dcd3c2a7b23be061edb065 # timeout=10
Commit message: "Move native Go examples to generic registration (#22245)"
 > git rev-list --no-walk a8775f0a4ac13fea440dc6e4b18f1bd5f821fcaf # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins9007469392348148553.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0713150359 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 stopped Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/k45gbekyxc4yc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #763

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/763/display/redirect?page=changes>

Changes:

[noreply] Split checkStyle from precommit into spotless job (#22203)

[noreply] Allow one to bound the size of output shards when writing to files.

[noreply] Bump moment from 2.29.2 to 2.29.4 in

[noreply] Allow BigQuery TableIds to have space in between (#22167)

[noreply] Use async as a suffix rather than a prefix for asynchronous variants.

[noreply] Override log levels after log handler is created (#22191)

[noreply] Remove deprecated unused option in seed job script (#22223)

[noreply] Better error for external BigQuery tables. (#22178)

[noreply] Try to fix playground workflow (#22226)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision a8775f0a4ac13fea440dc6e4b18f1bd5f821fcaf (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f a8775f0a4ac13fea440dc6e4b18f1bd5f821fcaf # timeout=10
Commit message: "Try to fix playground workflow (#22226)"
 > git rev-list --no-walk 262f2b7f91ac879cb8921a3e7d59d0315c9df9c4 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5877738462245646909.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0712150510 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 2s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ccoryyrzh2asw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #762

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/762/display/redirect?page=changes>

Changes:

[noreply] Parallelizable DataFrame/Series mean (#22174)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 262f2b7f91ac879cb8921a3e7d59d0315c9df9c4 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 262f2b7f91ac879cb8921a3e7d59d0315c9df9c4 # timeout=10
Commit message: "Parallelizable DataFrame/Series mean (#22174)"
 > git rev-list --no-walk 9fb8be0e3d9a44109024fb9b3c57c3997ec33a3d # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins374731059456405510.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0706185554 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/cbpfed4p75ijq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #761

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/761/display/redirect?page=changes>

Changes:

[noreply] Add typescript documentation to the programing guide. (#22137)

[noreply] [Website] Update minimum required Go version for sdk development


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 9fb8be0e3d9a44109024fb9b3c57c3997ec33a3d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 9fb8be0e3d9a44109024fb9b3c57c3997ec33a3d # timeout=10
Commit message: "[Website] Update minimum required Go version for sdk development (#22210)"
 > git rev-list --no-walk 4b4077dc8828452e6a49b1bc00db2fa551e453fb # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6236056406817245835.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0706185554 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/hb76csa3a6iw2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
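
Every run archived below fails identically before any test code executes: while Gradle is evaluating ':sdks:python:apache_beam:testing:load_tests', line 49 of that project's build.gradle reads a 'shadowJar' property on the Dataflow worker project, and the lookup fails because no such task or property is visible on that project at that point in configuration. The following is a minimal, hypothetical Groovy sketch of the failing pattern and one defensive alternative; it is not the actual Beam build file. The worker project's final path segment is masked as '****' in the log and is written here as 'worker' only as an assumption, and the surrounding wiring is invented for illustration.

  // Hypothetical sketch of the pattern behind load_tests/build.gradle line 49.
  // The project path ':runners:google-cloud-dataflow-java:worker' is assumed;
  // the log masks its final segment as '****'.
  //
  // An eager property access like the following throws
  // "Could not get unknown property 'shadowJar'" whenever the referenced
  // project has not registered a shadowJar task at evaluation time:
  //
  //   def workerJar = project(':runners:google-cloud-dataflow-java:worker').shadowJar.archivePath
  //
  // A more defensive variant: force that project to be evaluated first, then
  // look the task up by name so a missing task yields a clear diagnostic.
  evaluationDependsOn(':runners:google-cloud-dataflow-java:worker')

  def workerProject = project(':runners:google-cloud-dataflow-java:worker')
  def workerShadowJar = workerProject.tasks.findByName('shadowJar')
  if (workerShadowJar == null) {
    throw new GradleException("No shadowJar task found on ${workerProject.path}; " +
        "was the worker jar expected to be registered via -PwithDataflowWorkerJar=true?")
  }

Under that reading, the repeated failures across builds #751-#761 suggest the worker project stopped exposing a shadowJar task rather than anything specific to a given commit in the changes lists.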

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #760

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/760/display/redirect?page=changes>

Changes:

[bulat.safiullin] [Website] change case studies link from staging to relative path

[bulat.safiullin] [Website] add I/O Connectors link to dropdown list, updating link to

[noreply] Merge pull request #22096 from [Playground] Infrastructure for sharing

[noreply] Support dependencies and remote registration in the typescript SDK.

[noreply] [BEAM-13015, #22050] Make SDK harness msec counters faster using ordered

[yathu] Fix build error due to dep confliction of google-cloud-bigquery-storage

[yathu] Fix atomicwrites old version purge on pypi

[noreply] Fix default type inference of CombinePerKey. (#16351)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4b4077dc8828452e6a49b1bc00db2fa551e453fb (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4b4077dc8828452e6a49b1bc00db2fa551e453fb # timeout=10
Commit message: "Merge pull request #22205 Fix build error due to dep confliction of google-cloud-bigquery-storage and google-cloud-core"
 > git rev-list --no-walk d44c0440bc91f8fd63dcd082c2acf50b40e7af1b # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5127195298608250411.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0706185554 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zfv37czrpgf7u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #759

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/759/display/redirect?page=changes>

Changes:

[bulat.safiullin] [Website] add refresh to page-nav.js

[relax] set timestamp when outputting finalize element

[alexey.inkin] Declarative theming, Remove duplicate PlaygroundState for embedded page,

[yathu] Fix Hadoop upload corrupted due to buffer reuse

[benjamin.gonzalez] Fix testKafkaIOReadsAndWritesCorrectlyInStreaming failing for kafka

[noreply] Add `schema_options` and `field_options` on RowTypeConstraint (#22133)

[noreply] Optimize locking in several critical-path methods (#22162)

[noreply] Deprecate AWS IOs (Java) using AWS SDK v1 in favor of IOs in

[noreply] Update Go BPG xlang documentation to include Java automated service


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision d44c0440bc91f8fd63dcd082c2acf50b40e7af1b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f d44c0440bc91f8fd63dcd082c2acf50b40e7af1b # timeout=10
Commit message: "Update Go BPG xlang documentation to include Java automated service start-up (#22187)"
 > git rev-list --no-walk df162c1e2fb221c64cd861605fb35b37d2e6b8ec # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5519208075142530380.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0706185554 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/qiwzuygmt4i4u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #758

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/758/display/redirect?page=changes>

Changes:

[noreply] Enable passing tests on dataflow runner v2. (#22136)

[noreply] Merge pull request #17727 from [BEAM-9482] Fix "provided port is already

[noreply] Fix date for go 2.40 blog post

[noreply] Fix month for 2.40 go blog post

[noreply] [BEAM-14545] Optimize copies in dataflow v1 shuffle reader. (#17802)

[noreply] Tune StreamingModeExecutionContext allocations (#22142)

[noreply] [BEAM-3221] Improve documentation around split request and response

[noreply] Fix documentation about hand implemented global aggregations (#22173)

[noreply] Merge pull request #21872 from Standardizing output of WriteToBigQuery

[noreply] Propagate error messages from GcsUtil (#22079)

[noreply] Reenable Jenkins comment triggers (#22169)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision df162c1e2fb221c64cd861605fb35b37d2e6b8ec (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f df162c1e2fb221c64cd861605fb35b37d2e6b8ec # timeout=10
Commit message: "Reenable Jenkins comment triggers (#22169)"
 > git rev-list --no-walk 6dea0d15d0a97d243a2fe56684c2e193cbea14d2 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2838735083460005702.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0706185554 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ipvnewhrxaydq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #757

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/757/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11103] Add blog post for go 2.40 release (#17723)

[noreply] Fix test_row_coder_fail_early_bad_schema fails run after

[noreply] Tune ByteStringCoder allocations (#22144)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6dea0d15d0a97d243a2fe56684c2e193cbea14d2 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6dea0d15d0a97d243a2fe56684c2e193cbea14d2 # timeout=10
Commit message: "Tune ByteStringCoder allocations (#22144)"
 > git rev-list --no-walk b53b16f1fb41913b0e8bbe9d64d71b8e3ebfbbf6 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins184951323795895140.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0706150414 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/yniw45ai7oefk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #756

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/756/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b53b16f1fb41913b0e8bbe9d64d71b8e3ebfbbf6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b53b16f1fb41913b0e8bbe9d64d71b8e3ebfbbf6 # timeout=10
Commit message: "Bump cloud.google.com/go/pubsub from 1.23.0 to 1.23.1 in /sdks (#22122)"
 > git rev-list --no-walk b53b16f1fb41913b0e8bbe9d64d71b8e3ebfbbf6 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8600529995593285418.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0705150404 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/gwtgcon5sbt4k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #755

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/755/display/redirect?page=changes>

Changes:

[noreply] Go SDK: Update memfs to parse the List() pattern as a glob, not a regexp

[noreply] Bump cloud.google.com/go/pubsub from 1.23.0 to 1.23.1 in /sdks (#22122)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b53b16f1fb41913b0e8bbe9d64d71b8e3ebfbbf6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b53b16f1fb41913b0e8bbe9d64d71b8e3ebfbbf6 # timeout=10
Commit message: "Bump cloud.google.com/go/pubsub from 1.23.0 to 1.23.1 in /sdks (#22122)"
 > git rev-list --no-walk 85e8149cbcebc4a6b07d09501f96dfaec95c73bc # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6621171985168136527.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0704150407 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/mp2yzbwihx5oq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #754

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/754/display/redirect?page=changes>

Changes:

[noreply] Sharding IO tests(amazon web services and amazon web services 2) from


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 85e8149cbcebc4a6b07d09501f96dfaec95c73bc (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 85e8149cbcebc4a6b07d09501f96dfaec95c73bc # timeout=10
Commit message: "Sharding IO tests(amazon web services and amazon web services 2) from java post commit task (#21808)"
 > git rev-list --no-walk eb5b7cc256d8d15173475cf51af758979a33bd16 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2341324610393051310.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0703150408 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/e2c2xbt4mcye2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #753

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/753/display/redirect?page=changes>

Changes:

[noreply] Python: Use RowTypeConstraint for normalizing all schema-inferrable user

[noreply] changing nameBase value to Java_GCP_IO_Direct (#22128)

[noreply] Bump dataflow fnapi java sdk version (#22127)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision eb5b7cc256d8d15173475cf51af758979a33bd16 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f eb5b7cc256d8d15173475cf51af758979a33bd16 # timeout=10
Commit message: "Bump dataflow fnapi java sdk version (#22127)"
 > git rev-list --no-walk 680ed5b3a49990e2de0730b49233dfe22cfe9b8f # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1910419592067663947.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0702150411 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rwbkvms46kzv2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #752

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/752/display/redirect?page=changes>

Changes:

[alexey.inkin] Do not re-create PlaygroundState (#21950)

[Moritz Mack] Deprecate runner support for Spark 2.4 (closes #22094)

[noreply] Fixes #21698: Use normal Container snapshots for Go Load Tests (#22102)

[noreply] Change default, options, and explanation for issue priority (#22116)

[noreply] Minor: Bump flake8 to 4.0.1 (#22110)

[noreply] Add sdk_harness_log_level_overrides option for python sdk (#22077)

[noreply] Fix typo in Pytorch Bert Language Modeling (#22114)

[noreply] Fix #21977: Add Search transform to Go FhirIO (#21979)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 680ed5b3a49990e2de0730b49233dfe22cfe9b8f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 680ed5b3a49990e2de0730b49233dfe22cfe9b8f # timeout=10
Commit message: "Merge pull request #22097 from mosche/22094-DeprecateSpark2"
 > git rev-list --no-walk dd813a7f7352c077cc1c433ffe2bfe05f22d4b8d # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4379013325862009552.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0701150419 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/sgjwzorm4ocwk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #751

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/751/display/redirect?page=changes>

Changes:

[alexey.inkin] Add an abstract layer for analytics, fix logging change of snippet, fix

[bulat.safiullin] [Website] add scroll-spy to body in case-studies/baseof.html

[noreply] [BEAM-6597] Replace ProgressRequestCallback with BundleProgressReporter

[noreply] [Go SDK] Go Lint fixes  (#21967)

[noreply] Fix #21869: Close GRPC connections on cancel (#21874)

[noreply] Add FlatMap(<builtin>) known issue to 2.40.0 blog (#22101)

[noreply] [BEAM-14347] Update docs to prefer generic registration functions

[Andrew Pilloud] Projection Pushdown optimizer on by default

[noreply] Merge pull request #21752 from Feature/beam 13852 reimplement with

[noreply] Change wording of Pytorch LM example (#22099)

[noreply] Fix missing model_params in Pytorch docstring  (#22100)

[noreply] Test and fix FlatMap(<builtin>) issue (#22104)

[noreply] Fix InputStream on platform with 4 bytie long (#22107)

[noreply] [BEAM-14187] Fix NPE at initializeForKeyedRead in IsmReaderImpl (#22111)

[noreply] Remove unused legacy dataflow translate code (#22019)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision dd813a7f7352c077cc1c433ffe2bfe05f22d4b8d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f dd813a7f7352c077cc1c433ffe2bfe05f22d4b8d # timeout=10
Commit message: "Remove unused legacy dataflow translate code (#22019)"
 > git rev-list --no-walk 340b4217639753e7b16dedce29916491644a6c82 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8376766846722049186.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0630154742 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/p52mn5uq2fbdq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #750

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/750/display/redirect?page=changes>

Changes:

[damondouglas] Implement PubsubSchemaTransformMessageToFactory

[noreply] sharding GCP IO tests from the javaPostCommit task (#21800)

[noreply] Bump cloud.google.com/go/storage from 1.22.1 to 1.23.0 in /sdks (#22038)

[noreply] Followup sharding javaPostCommit (#22081)

[noreply] remove mention of dill in release notes as it's not relevant. (#22087)

[noreply] [#21634] Add comments on FieldValueGetter. (#21982)

[noreply] Bump google.golang.org/api from 0.85.0 to 0.86.0 in /sdks (#22092)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 340b4217639753e7b16dedce29916491644a6c82 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 340b4217639753e7b16dedce29916491644a6c82 # timeout=10
Commit message: "Bump google.golang.org/api from 0.85.0 to 0.86.0 in /sdks (#22092)"
 > git rev-list --no-walk a3b3182e38fe6b2152f371d4232ddc5d22feed71 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5850094982069037567.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0629153504 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 4 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/laed2rjm4df6o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #749

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/749/display/redirect?page=changes>

Changes:

[Pablo Estrada] Blog post and updates for release 2.40.0

[noreply] 22011 remove checks on client.close() except when

[noreply] update flutter version to 3.0.1-stable (#22062)

[noreply] Add randomness to integration test job names to avoid collisions

[noreply] Give @pcoet triage permission (#22068)

[noreply] Issue#20877 Updated Interactive Beam README (#22034)

[noreply] Update issue bot to javascript and add label management (#22067)

[noreply] Clean up issue management doc page

[noreply] [BEAM-13015, #21250, fixes #22053] Improve PCollectionConsumerRegistry


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision a3b3182e38fe6b2152f371d4232ddc5d22feed71 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f a3b3182e38fe6b2152f371d4232ddc5d22feed71 # timeout=10
Commit message: "[BEAM-13015, #21250, fixes #22053] Improve PCollectionConsumerRegistry performance by swapping element count and sampled byte size to use a faster counter. (#22002)"
 > git rev-list --no-walk 7ad4864b0cb19b6c8405265f84fff24bf5b2c8b3 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4563956605503559421.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0628150946 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ovngilfrl2xgk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #748

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/748/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 7ad4864b0cb19b6c8405265f84fff24bf5b2c8b3 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 7ad4864b0cb19b6c8405265f84fff24bf5b2c8b3 # timeout=10
Commit message: "Fix SpannerIO flakes (#22023)"
 > git rev-list --no-walk 7ad4864b0cb19b6c8405265f84fff24bf5b2c8b3 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2168118793752211353.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0627151135 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/drmyc2zf3mrjm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #747

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/747/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 7ad4864b0cb19b6c8405265f84fff24bf5b2c8b3 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 7ad4864b0cb19b6c8405265f84fff24bf5b2c8b3 # timeout=10
Commit message: "Fix SpannerIO flakes (#22023)"
 > git rev-list --no-walk 7ad4864b0cb19b6c8405265f84fff24bf5b2c8b3 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2840747005718017721.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0626150939 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/jyjbynp4oyo7u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #746

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/746/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Use WindowedValue.withValue rather than WindowedValue.of in

[Robert Bradshaw] [BEAM-14464] More efficient grouping keys in precombiner table.

[Robert Bradshaw] fix compile after merge

[Robert Bradshaw] spotless

[Robert Bradshaw] Only flush every Nth element.

[Robert Bradshaw] spotless

[Robert Bradshaw] Post-merge fix.

[Robert Bradshaw] Fix test expectations.

[bulat.safiullin] [Website] add guard expressions to fix-menu and page-nav

[noreply] Unify to a single issue report (#22045)

[noreply] Remove colon in issue report

[noreply] Bump cloud.google.com/go/pubsub from 1.22.2 to 1.23.0 in /sdks (#22036)

[noreply] Fix vendored dependency issue and other style checks (#22046)

[noreply] Bump shell-quote (#21983)

[noreply] Revert "[BEAM-13590]Update Pytest version to support Python 3.10

[noreply] Bump cloud.google.com/go/bigquery from 1.32.0 to 1.34.1 in /sdks

[noreply] Bump github.com/spf13/cobra from 1.4.0 to 1.5.0 in /sdks (#21955)

[yathu] checkStlye Fix: remove redundant static and public in interface. camel

[noreply] Fix DEADLINE_EXCEEDED flakiness  (#22035)

[noreply] Fix SpannerIO flakes (#22023)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 7ad4864b0cb19b6c8405265f84fff24bf5b2c8b3 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 7ad4864b0cb19b6c8405265f84fff24bf5b2c8b3 # timeout=10
Commit message: "Fix SpannerIO flakes (#22023)"
 > git rev-list --no-walk 10dab960d9695266fbbbeb040a378550fb440be6 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8356857408503159363.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0625150913 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rwukxvpxalkgk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #745

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/745/display/redirect?page=changes>

Changes:

[noreply] Canonicalize standard_coders.yaml booleans

[noreply] Followup fix FileIOTest.testMatchWatchForNewFiles flaky (#21877)

[noreply] Fix links for issue report (#22033)

[noreply] Merge pull request #21953 from Implement

[noreply] Enable close issue as not planned (#22032)

[noreply] Rename README.md to ACTIONS.md (#22043)

[noreply] Removes examples of unscalable sinks from documentation. (#22020)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 10dab960d9695266fbbbeb040a378550fb440be6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 10dab960d9695266fbbbeb040a378550fb440be6 # timeout=10
Commit message: "Removes examples of unscalable sinks from documentation. (#22020)"
 > git rev-list --no-walk dc0b5e40417ad6c63890fef89d770a0606ce7282 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8434804929758533913.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0624151040 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/scow6vy43xyxe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #744

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/744/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Streaming-related runner fixes.

[Robert Bradshaw] Improvements to auto-started services.

[Robert Bradshaw] Fix version, asserts for remote execution.

[Robert Bradshaw] Add IO dependencies.

[Robert Bradshaw] Add several cross-language IOs.

[Robert Bradshaw] Disable tests that require new release is required for out-of-the-box

[rszper] Correcting the regex for the Dataflow job name.

[noreply] Merge pull request #21981 from [Playground] Upgrade Flutter linter, fix

[andyye333] Move wrapper class outside run()

[noreply] Clean up redundant articles, prepositions, conjunctions appeared

[noreply] Fix FlatMap numpy array bug (#22006)

[Robert Bradshaw] More strongly typed outputs.

[noreply] Fix issues with test ordering (#21986)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision dc0b5e40417ad6c63890fef89d770a0606ce7282 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f dc0b5e40417ad6c63890fef89d770a0606ce7282 # timeout=10
Commit message: "Fix issues with test ordering (#21986)"
 > git rev-list --no-walk 242f8f3ffe4802bce130403690241fcab0bd7281 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2093238737971764951.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0623150957 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/5ygezskbqmfqe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #743

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/743/display/redirect?page=changes>

Changes:

[yiru] fix: Add a retry code to insertall retry policy

[johnjcasey] 21742 add warning for risky kafka configuration

[johnjcasey] 21742 run spotless

[noreply] Fix target email for flaky test/p0/p1 reports

[noreply] Add unit testing for graphx/user.go (#21962)

[bulat.safiullin] [Website] add lyft to quote cards on homepage, use relative paths for

[noreply] Update documentations and document generation (#21965)

[noreply] Add ExecuteBundles transform to Go FhirIO (#21840)

[noreply] Bump cloud.google.com/go/datastore from 1.6.0 to 1.8.0 in /sdks (#21973)

[noreply] Bump google.golang.org/api from 0.83.0 to 0.85.0 in /sdks (#21974)

[noreply] [Go SDK] Adds a snippet for GBK in BPG (#21842)

[noreply] Update parameterized requirement in /sdks/python (#21975)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 242f8f3ffe4802bce130403690241fcab0bd7281 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 242f8f3ffe4802bce130403690241fcab0bd7281 # timeout=10
Commit message: "Update parameterized requirement in /sdks/python (#21975)"
 > git rev-list --no-walk 75cba1085a3f6f069d78096f3a3eb95076129525 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins560785206642379670.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0622150950 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/3ma73lmbx227q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #742

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/742/display/redirect?page=changes>

Changes:

[noreply] Modified KafkaIO.Read SDF->Legacy forced override to fail if configured

[noreply] [BEAM-13590]Update Pytest version to support Python 3.10 (#17791)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 75cba1085a3f6f069d78096f3a3eb95076129525 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 75cba1085a3f6f069d78096f3a3eb95076129525 # timeout=10
Commit message: "[BEAM-13590]Update Pytest version to support Python 3.10 (#17791)"
 > git rev-list --no-walk 0ef5d3a185c1420da118208353ceb0b40b3a27c9 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8362676324261311956.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0621152505 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
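For scale: the --input_options above ask for 2,097,152 records with 1-byte keys and 1,024-byte values, so the value payload alone is 2^21 x 2^10 bytes = 2^31 bytes = 2 GiB (keys add roughly another 2 MiB), which matches the "2GB" in the test banner echoed before the Gradle invocation.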
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 18s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zuld3cmplwnyu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #741

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/741/display/redirect?page=changes>

Changes:

[Alexey Romanenko] [BEAM-12918] Add PostCommit_Java_Tpcds_Flink job


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0ef5d3a185c1420da118208353ceb0b40b3a27c9 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0ef5d3a185c1420da118208353ceb0b40b3a27c9 # timeout=10
Commit message: "Merge pull request #21747: [BEAM-12918] Add PostCommit_Java_Tpcds_Flink job"
 > git rev-list --no-walk de5c56a5b8a8a030e7e67323a696d52495e37f7f # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1545143291348946793.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0620150947 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/3fbd4hezh3uzm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #740

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/740/display/redirect?page=changes>

Changes:

[Pablo Estrada] Removing playground from main page to remove scrolling issue

[noreply] Merge pull request #21940 from [21941] Fix no output timestamp case


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision de5c56a5b8a8a030e7e67323a696d52495e37f7f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f de5c56a5b8a8a030e7e67323a696d52495e37f7f # timeout=10
Commit message: "Merge pull request #21940 from [21941] Fix no output timestamp case"
 > git rev-list --no-walk 525a169e6f807e301f1ac5e039645d4961da18d7 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8547657782720283792.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0619150848 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/cpfqzuzkdzli4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #739

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/739/display/redirect?page=changes>

Changes:

[yathu] Unsickbay copy_rewrite_token tests

[Kenneth Knowles] Suppress unneeded spotbugs unused store warnings

[Kenneth Knowles] Eliminate nullness errors in KafkaIO

[yathu] Fix beam_PostCommit_Java_Sickbay build

[bulat.safiullin] [Website] add publishdate attribute to frontmatter

[noreply] Add guidance on self-assigning/closing to issue templates (#21931)

[noreply] Update names.py

[noreply] [Website] add new case-study, fix styles, add related images (#21891)

[noreply] Merge pull request #21928 from [Fixes #21927] Compress

[noreply] BigQueryIO: Adding the BASIC view setting to getTable request  (#21879)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 525a169e6f807e301f1ac5e039645d4961da18d7 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 525a169e6f807e301f1ac5e039645d4961da18d7 # timeout=10
Commit message: "Merge pull request #21933 from Update container tags used by Dataflow runner with unreleased SDKs"
 > git rev-list --no-walk b5ea07d77c0a7200aaa6af51b3d48d5a4da7f817 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins3483872363968078278.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0618150548 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/gmwc7elo27pno

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #738

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/738/display/redirect?page=changes>

Changes:

[yathu] [BEAM-3177][BEAM-5468] Add pipeline options to set default logging level

[noreply] Remove dataframe warnings from py38-docs logs (#21861)

[noreply] Update references to Jira to GH for the Java SDK (#21836)

[noreply] [21709] - Fix for "beam_PostCommit_Java_ValidatesRunner_Samza Failing"

[noreply] Update references to jira to GH for the Runners (#21835)

[noreply] Update remaining references to Jira to GH (#21834)

[ahmedabualsaud] test fixes

[ahmedabualsaud] no need for this line

[Kenneth Knowles] Re-activate nullness checking for some of sdks/java/core/coders

[noreply] Expand pr bot to python (#21791)

[noreply] Update run inference documentation (#21921)

[noreply] Consider skipped checks successful (#21924)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b5ea07d77c0a7200aaa6af51b3d48d5a4da7f817 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b5ea07d77c0a7200aaa6af51b3d48d5a4da7f817 # timeout=10
Commit message: "Consider skipped checks successful (#21924)"
 > git rev-list --no-walk bcdc5392c2175a48c9c4f75bf5d3b57a4d15ac85 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8174678518754373345.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0617150541 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zwwg44uhvkjue

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #737

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/737/display/redirect?page=changes>

Changes:

[naireenhussain] convert windmill min timestamp to beam min timestamp

[nielm] Add Spanner Integration tests to verify exception handling

[egalpin] Drops usage of setWindowingStrategyInternal in favour of direct use of

[noreply] Switch go todos from issue # syntax to links (#21890)

[Valentyn Tymofieiev] Rollback dill.

[noreply] Add Pytorch image segmentation example (#21766)

[noreply] Add README documentation for scikit-learn MNIST example (#21887)

[noreply] Decompose labels for new issues (#21888)

[noreply] Use Go 1.18 for go-licenses (#21896)

[egalpin] Gives unique names to ES IO Write windowing

[noreply] [BEAM-12903] Cron job to cleanup Dataproc leaked resources (#21779)

[noreply] [BEAM-7209][BEAM-9351][BEAM-9428] Upgrade Hive to version 3.1.3 (#17749)

[noreply] Sharding IO tests (Kafka, Debezium, JDBC, Kinesis, Neo4j) from the

[noreply] Merge pull request #17604 from [BEAM-14315] Match updated files

[noreply] Merge pull request #21781 from Sklearn Mnist example and IT test

[Pablo Estrada] Update Python base image requirements

[noreply] Get the latest version of go-licenses (#21901)

[noreply] Hide internal helpers added to DoFn for batched DoFns (#21860)

[noreply] Updated documentation for ml.inference docs. (#21868)

[Pablo Estrada] Moving to 2.41.0-SNAPSHOT on master branch.

[noreply] Add a type hint to nexmark query 3 joinFn (#21873)

[Kenneth Knowles] Revert "convert windmill min timestamp to beam min timestamp"

[noreply] Fix a few small config issues (#21909)

[dannymccormick] Update py to python label

[noreply] Daily p0/p1/flaky reports for issues (#21725)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision bcdc5392c2175a48c9c4f75bf5d3b57a4d15ac85 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f bcdc5392c2175a48c9c4f75bf5d3b57a4d15ac85 # timeout=10
Commit message: "Daily p0/p1/flaky reports for issues (#21725)"
 > git rev-list --no-walk 9a74f17a4c11955eb54c0bc6aae4ba42c225fbea # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4619891630070967870.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0616153037 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 4s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/co5ggsl4fcimm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #736

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/736/display/redirect?page=changes>

Changes:

[nielm] Add transform names to help debug flaky test

[dannymccormick] Mark issues as triaged when they are assigned

[chamikaramj] Automatically enable Runner v2 for pipelines that use cross-language

[bulat.safiullin] [BEAM-13229] side nav bug fixed

[bulat.safiullin] fix links for pipelines

[noreply] Split PytorchModelHandler into PytorchModelHandlerTensor and

[noreply] Fix Hadoop Downloader Range not correct (#21778)

[noreply] [BEAM-14036] Read Configuration for Pub/Sub SchemaTransform (#17730)

[noreply] [Go SDK] Add more info to Worker Status API (#21776)

[noreply] Make PeriodicImpulse generates unbounded PCollection (#21815)

[noreply] [BEAM-14267] Update watchForNewFiles to allow watching updated files

[noreply] fix timestamp conversion in Google Cloud Datastore Connector (#17789)

[noreply] Update references to Jira to GH for the Go label (#21830)

[noreply] [#21853] Adjust Go cross-compile to target entire package (#21854)

[Kenneth Knowles] Adjust Jenkins configuration to allow more memory per JVM

[noreply] [BEAM-14553] Add destination coder to FileResultCoder components

[noreply] copyedited README for RunInference examples (#21855)

[noreply] Document and test overriding batch type inference (#21844)

[noreply] Update references to Jira to GH for the Python SDK (#21831)

[noreply] add highlights to changes (#21865)

[noreply] Merge pull request #21793: [21794 ] Fix output timestamp in Dataflow.

[noreply] Adding more info to the sdk_worker_parallelism description (#21839)

[noreply] Add Bert Language Modeling example (#21818)

[noreply] [BEAM-14524] Returning NamedTuple from RunInference transform (#17773)

[noreply] Unit tests for RunInference keyed/unkeyed Modelhandler and examples

[noreply] Remove kwargs and add explicit runinference_args (#21806)

[noreply] Modify README for 3 pytorch examples (#21871)

[noreply] Sickbay Pytorch example IT test (#21857)

[noreply] Add required=True to Pytorch image classification example (#21883)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-4 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 9a74f17a4c11955eb54c0bc6aae4ba42c225fbea (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 9a74f17a4c11955eb54c0bc6aae4ba42c225fbea # timeout=10
Commit message: "Add required=True to Pytorch image classification example (#21883)"
 > git rev-list --no-walk 12ba4cea9d6a76a522106e6bb55f46fed091669f # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5701935074059228796.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0615152621 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 3 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1m 18s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/3hibnz2j6da22

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #735

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/735/display/redirect?page=changes>

Changes:

[noreply] Bump cloud.google.com/go/pubsub from 1.21.1 to 1.22.2 in /sdks

[dannymccormick] Stop collecting jira metrics

[dannymccormick] Move to contains notation

[dannymccormick] fix query to get all updated issues

[noreply] Add RunInference API to CHANGES.md (#21754)

[Kenneth Knowles] Do not allow postcommit jobs phrase triggering

[noreply] Refactor API code to base.py in RunInference (#21801)

[noreply] Provide a diagnostic error message when a filesystem scheme is not

[Kiley Sok] Disable more triggers

[noreply] [BEAM-14532] Add integration testing to fhirio Read transform (#17803)

[noreply] Merge pull request #17794 from [#21252] Enforce pubsub message

[noreply] Separated pandas and numpy implementations of sklearn. (#21803)

[noreply] Composite triggers and unit tests for Go SDK (#21756)

[Kiley Sok] Enable phrase trigger for a few post commits

[Kiley Sok] spotless

[noreply] [BEAM-14557] Read and Seek Runner Capabilities in Go SDK  (#17821)

[noreply] [BEAM-13806] Add x-lang BigQuery IO integration test to Go SDK. (#16818)

[Jan Lukavský] [BEAM-14265] Add watermark hold for all timers

[noreply] Bump Python beam-master container (#21820)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-6 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 12ba4cea9d6a76a522106e6bb55f46fed091669f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 12ba4cea9d6a76a522106e6bb55f46fed091669f # timeout=10
Commit message: "Bump Python beam-master container (#21820)"
 > git rev-list --no-walk 63cd54e2e2b18d6d673adeae72fe4f60a3d8732f # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6825437849351840175.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0614152215 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 3 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 54s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/7goqvhjtiohh6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #734

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/734/display/redirect?page=changes>

Changes:

[noreply] Refactor code according to keyedModelHandler changes (#21819)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-6 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 63cd54e2e2b18d6d673adeae72fe4f60a3d8732f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 63cd54e2e2b18d6d673adeae72fe4f60a3d8732f # timeout=10
Commit message: "Refactor code according to keyedModelHandler changes (#21819)"
 > git rev-list --no-walk b8e2e85ab1fb37a2f89ed20d88730e591ea3bf7e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7121477853505068562.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0610150747 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/luxfibufsyzro

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #733

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/733/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-6 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b8e2e85ab1fb37a2f89ed20d88730e591ea3bf7e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b8e2e85ab1fb37a2f89ed20d88730e591ea3bf7e # timeout=10
Commit message: "Make keying of examples explicit. (#21777)"
 > git rev-list --no-walk b8e2e85ab1fb37a2f89ed20d88730e591ea3bf7e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2140576185273409715.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0610150747 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/423h3leped4qe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
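
The identical evaluation error repeats in every timer-triggered run below, which points at the build script itself rather than at flaky test infrastructure. A guarded, deferred lookup along the following lines would fail with a clearer diagnostic instead of aborting evaluation of :sdks:python:apache_beam:testing:load_tests; this is again only a sketch, assuming the masked project is the Dataflow worker and that -PwithDataflowWorkerJar=true is what normally makes its shadowJar task available:

    // Sketch only; the project path and the role of -PwithDataflowWorkerJar are
    // assumptions, not taken from the actual build script.
    evaluationDependsOn(':runners:google-cloud-dataflow-java:worker')
    def workerShadowJar = project(':runners:google-cloud-dataflow-java:worker')
            .tasks.findByName('shadowJar')
    if (workerShadowJar == null) {
        logger.warn('shadowJar task not found on the Dataflow worker project; ' +
                'check whether -PwithDataflowWorkerJar=true still applies the Shadow plugin')
    }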

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #732

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/732/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14535] Added support for pandas in sklearn inference runner

[noreply] Merge ModelLoader and InferenceRunner into same class. (#21795)

[noreply] Merge pull request #17589 from [BEAM-14422] Exception testing for

[noreply] Add README for image classification example (#21758)

[anandinguva98] fixup: bug

[noreply] Fix every PR linking to PR 123 (#21802)

[noreply] Add native PubSub IO prototype to Go (#17955)

[noreply] Allow creation of dynamically defined transforms in the Python expansion

[noreply] Make keying of examples explicit. (#21777)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-6 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b8e2e85ab1fb37a2f89ed20d88730e591ea3bf7e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b8e2e85ab1fb37a2f89ed20d88730e591ea3bf7e # timeout=10
Commit message: "Make keying of examples explicit. (#21777)"
 > git rev-list --no-walk 0de98210f4531fbfd88265bc02052b27bd299602 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1812392021055503613.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0610150747 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/5atanmbnsj2gw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #731

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/731/display/redirect?page=changes>

Changes:

[dannymccormick] Update dashboards to use gh data instead of jira data

[noreply] Merge pull request #21746: Exclude GCP Java packages from Dependabot

[noreply] Update .test-infra/metrics/grafana/dashboards/source_data_freshness.json

[noreply] Better cross language support for dataframe reads. (#21762)

[noreply] Add template_location flag to Go Dataflow runner (#21774)

[noreply] [BEAM-14406] Drain test for SDF truncation in Go SDK (#17814)

[noreply] More Jira -> Issues doc updates (#21770)

[noreply] [BEAM-11104] Add code snippet for Go SDK Self-Checkpointing (#17956)

[noreply] [BEAM-13769]Add no_xdist marker for cloudpickle test (#17538)

[noreply] [BEAM-14533] Bump cloudpickle to 2.1.0 (#17780)

[noreply] Add basic byte size estimation for batches (#17771)

[noreply] Add @yields_batches and @yields_elements (#19268)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-6 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0de98210f4531fbfd88265bc02052b27bd299602 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0de98210f4531fbfd88265bc02052b27bd299602 # timeout=10
Commit message: "Add @yields_batches and @yields_elements (#19268)"
 > git rev-list --no-walk 67533d17fd70c0c8994a3eb758b175dddfaea83b # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2098260884467871742.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0610150747 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 16s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/j7bmvhjhjmcz4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #730

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/730/display/redirect?page=changes>

Changes:

[nishantjain] [BEAM-14000] Elasticsearch IO does not work when both username/password

[nishantjain] Fixes issue with httpclientbuilder - Use the existing builder instead of

[nishantjain] moves sslcontext towards starting of function

[nishantjain] adds unit test

[nishantjain] changes unit test to directly built restclient

[nishantjain] changes name of unit test

[nishantjain] adds test to all elasticsearch folder

[nishantjain] updates changes.md

[nishantjain] spotless fix

[dannymccormick] Gather metrics on GH Issues

[dannymccormick] Fixes

[dannymccormick] Fixes

[dannymccormick] Comment + naming fix

[dannymccormick] Conflicts fix

[dannymccormick] Ordering

[dannymccormick] Different fallback for prs/issues

[noreply] Add ability to self-assign issues for non-committers (#21719)

[dannymccormick] Fix sync time

[noreply] Dont try to generate jiras as part of dependency report (#21753)

[noreply] Allow users to comment `.take-issue` without taking (#21755)

[noreply] Merge pull request: [Beam-14528]: Add ISO time format support for

[noreply] Update all links to in progress jiras to issues (#21749)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 67533d17fd70c0c8994a3eb758b175dddfaea83b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 67533d17fd70c0c8994a3eb758b175dddfaea83b # timeout=10
Commit message: "Merge pull request #17297 from nishantjain91/elasticsearch_fix"
 > git rev-list --no-walk a1c3d0cd60d686196e8643ebf3eef9816a24b66a # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7748109413361912237.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0609150610 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/tzucxp7z24z7i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #729

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/729/display/redirect?page=changes>

Changes:

[nielm] Fix SpannerIO service call metrics and improve tests.

[andyye333] Add Pytorch support for batched keyed examples

[andyye333] Add general support for non-batchable kwargs params; Add

[noreply] [BEAM-12554] Create new instances of FileSink in sink_fn (#17708)

[noreply] DataflowRunner: Experiment added to disable unbounded PCollection

[vachan] Fix for increased FAILED_PRECONDITION errors in BQ Read API.

[noreply] More flexible Python Callable type. (#17767)

[noreply] Fix typos in README (#17675)

[vachan] Adding comments.

[noreply] Bump google.golang.org/api from 0.81.0 to 0.83.0 in /sdks (#21743)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision a1c3d0cd60d686196e8643ebf3eef9816a24b66a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f a1c3d0cd60d686196e8643ebf3eef9816a24b66a # timeout=10
Commit message: "Bump google.golang.org/api from 0.81.0 to 0.83.0 in /sdks (#21743)"
 > git rev-list --no-walk e62ae391985fc13c7df1ee6e088525835ceaa560 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8236395090098188378.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0608150551 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/3hbaqfspkgdt2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #728

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/728/display/redirect?page=changes>

Changes:

[yathu] [BEAM-14471] Fix PytestUnknownMarkingWarning

[Robert Bradshaw] Populate missing display data for remotely expanded transforms.

[Robert Bradshaw] Add an option to run Python operations in-line when invoked as a remote

[Robert Bradshaw] Pass options underlying runner in remote job service.

[noreply] Update Jira -> Issues in the Readme

[Alexey Romanenko] [BEAM-12918] Add PostCommit_Java_Tpcds_Dataflow job

[noreply] Clean up uses of == instead of === in ts sdk (#17732)

[Robert Bradshaw] Comment, lint fixes.

[noreply] Mount GCP credentials in local docker environments. (#19265)

[noreply] [BEAM-14068]Add Pytorch inference IT test and example (#17462)

[noreply] [Playground] [Hotfix] Remove autoscrolling from embedded editor (#21717)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e62ae391985fc13c7df1ee6e088525835ceaa560 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e62ae391985fc13c7df1ee6e088525835ceaa560 # timeout=10
Commit message: "Merge pull request #17680: [BEAM-12918] Add PostCommit_Java_Tpcds_Dataflow job"
 > git rev-list --no-walk 9a7c9ce9b84c0e17db0647d1652ad01e0d527eee # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8980759139499028493.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0607150601 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/lfa7trpstnzww

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #727

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/727/display/redirect?page=changes>

Changes:

[noreply] [Fixes #18679] Ensure that usage of metrics on a template job reports an


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 9a7c9ce9b84c0e17db0647d1652ad01e0d527eee (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 9a7c9ce9b84c0e17db0647d1652ad01e0d527eee # timeout=10
Commit message: "[Fixes #18679] Ensure that usage of metrics on a template job reports an error (#18905)"
 > git rev-list --no-walk 4dce7b8857f37608321253073745fe7611a48af9 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2505222853943532384.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0606150542 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/m3bg54tdq2rli

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #726

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/726/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4dce7b8857f37608321253073745fe7611a48af9 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4dce7b8857f37608321253073745fe7611a48af9 # timeout=10
Commit message: "[BEAM-14556] Honor the formatter installed on the root handler. (#17820)"
 > git rev-list --no-walk 4dce7b8857f37608321253073745fe7611a48af9 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2312229268456118203.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0605150544 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/f5ty5ppiq7lxk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #725

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/725/display/redirect?page=changes>

Changes:

[Pablo Estrada] Revert "Merge pull request #17492 from [BEAM-13945] (FIX) Update Java BQ

[noreply] Alias worker_harness_container_image to sdk_container_image (#17817)

[noreply] [BEAM-14546] Fix errant pass for empty collections in Count (#17813)

[noreply] Merge pull request #17741 from [BEAM-14504] Add support for including

[noreply] Merge pull request #18374 from [BEAM-13945] Roll forward JSON support

[noreply] Merge pull request #17792 from [BEAM-13756] [Playground] Merge Log and

[noreply] Merge pull request #17779: [BEAM-14529] Add integer to float64

[noreply] [BEAM-14556] Honor the formatter installed on the root handler. (#17820)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4dce7b8857f37608321253073745fe7611a48af9 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4dce7b8857f37608321253073745fe7611a48af9 # timeout=10
Commit message: "[BEAM-14556] Honor the formatter installed on the root handler. (#17820)"
 > git rev-list --no-walk 8e105977f963defeb9bbac5a94275cb356069c5a # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6795697062723060415.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0604150559 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rjce5xw572n32

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #724

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/724/display/redirect?page=changes>

Changes:

[dannymccormick] [BEAM-14446] Update some docs to point to GitHub issues

[dannymccormick] More doc updates

[dannymccormick] Update issueManagement fields

[dannymccormick] Fix website build

[dannymccormick] Remove extraneous comment line

[noreply] Commit message guidance

[noreply] [BEAM-10976] Fix bug with bundle finalization on SDFs (and a small doc

[noreply] Bump google.golang.org/grpc from 1.46.2 to 1.47.0 in /sdks (#17806)

[noreply] Rename pytorch files (#17798)

[noreply] Merge pull request #17492 from [BEAM-13945] (FIX) Update Java BQ

[noreply] [BEAM-11105] Add more watermark estimation docs for go (#17785)

[noreply] [BEAM-11106] documentation for SDF truncation in Go (#17781)

[noreply] [BEAM-11167] Updates dill package to version 0.3.5.1 (#17669)

[noreply] [BEAM-6258] Use gRPC 1.33.1 as min version to ensure that we pickup

[noreply] [BEAM-14441] Enable GitHub issues (#17812)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 8e105977f963defeb9bbac5a94275cb356069c5a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 8e105977f963defeb9bbac5a94275cb356069c5a # timeout=10
Commit message: "[BEAM-14441] Enable GitHub issues (#17812)"
 > git rev-list --no-walk 999bceab8e87d25f30faffe7d6431e2d8588663f # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8484491092093905707.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0603150543 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
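The '2GB' in the test name lines up with the --input_options in the command above; a quick check of the arithmetic:

    2,097,152 records x 1,024 value bytes = 2,147,483,648 bytes (exactly 2 GiB)
    2,097,152 records x 1 key byte        =     2,097,152 bytes (about 2 MiB)

Note also that the command defines -Dorg.gradle.jvmargs twice; assuming the usual last-definition-wins handling of duplicate -D flags, only -Xmx4g takes effect and the -Xms2g setting is dropped. This is an observation about the logged command, not something the build itself reports.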
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/pcqpbzhuywoyg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #723

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/723/display/redirect?page=changes>

Changes:

[Kenneth Knowles] Fix parsing of -PenableCheckerFramework in build

[Kenneth Knowles] Fix additional nullness errors in BigQueryIO

[yathu] [BEAM-13984] followup Fix precommit

[noreply] [BEAM-14513] Add read transform and initial healthcare client (#17748)

[noreply] [BEAM-14536] Handle 0.0 splits in offsetrange restriction (#17782)

[noreply] [BEAM-14470] Use lifecycle method names directly. (#17790)

[noreply] [BEAM-14297] add nullable annotations and an integration test (#17742)

[noreply] Only generate Javadocs for latest Spark runner version (Spark 3) to fix

[noreply] Fail Javadoc aggregateJavadoc task if there's an error (#17801)

[noreply] Merge pull request #17753 from [BEAM-14510] adding exception tests to

[noreply] feat: allow for unknown values in change streams (#17655)

[noreply] Support JdbcIO autosharding in Python (#16921)

[noreply] [BEAM-14511] Growable Tracker for Go SDK (#17754)

[noreply] [BEAM-14539] Ensure that the print stream can handle larger byte arrays


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 999bceab8e87d25f30faffe7d6431e2d8588663f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 999bceab8e87d25f30faffe7d6431e2d8588663f # timeout=10
Commit message: "[BEAM-14539] Ensure that the print stream can handle larger byte arrays being written and also allow for a growable amount of carry over. (#17787)"
 > git rev-list --no-walk ca33943808c56ce634c92eb85f865285c71ee048 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5595383732220588925.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/65zspmrmx3mtu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #722

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/722/display/redirect?page=changes>

Changes:

[chamikaramj] Adds Java cross-language transforms for invoking Python Map and FlatMap

[noreply] Merge pull request #17683 from [BEAM-14475] add test cases to GcsUtil

[noreply] [BEAM-14410] Add test to demonstrate BEAM-14410 issue in non-cython

[noreply] [BEAM-14449] Support cluster provisioning when using Flink on Dataproc

[noreply] [BEAM-14527] Implement "Beam Summit 2022" banner (#17776)

[noreply] Merge pull request #17222 from [BEAM-12164] Feat: Add new restriction

[noreply] Merge pull request #17598 from [BEAM-14451] Support export to BigQuery

[noreply] Add typing information to RunInferrence. (#17762)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision ca33943808c56ce634c92eb85f865285c71ee048 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f ca33943808c56ce634c92eb85f865285c71ee048 # timeout=10
Commit message: "Add typing information to RunInferrence. (#17762)"
 > git rev-list --no-walk 31114e893cea46834a7f92451c1c1c2633c8fa40 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4610402335849140739.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/tuz4otv2ifwzs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #721

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/721/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14255] Drop clock abstraction (#17671)

[noreply] Adds __repr__ to NullableCoder (#17757)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 31114e893cea46834a7f92451c1c1c2633c8fa40 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 31114e893cea46834a7f92451c1c1c2633c8fa40 # timeout=10
Commit message: "Adds __repr__ to NullableCoder (#17757)"
 > git rev-list --no-walk 9a6f7699b5d8daf846221d522d3702c5a4c7b562 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins634117498956676446.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/jrmgwnjci3t7w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #720

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/720/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14170] - Create a test that runs sickbayed tests (#17471)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 9a6f7699b5d8daf846221d522d3702c5a4c7b562 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 9a6f7699b5d8daf846221d522d3702c5a4c7b562 # timeout=10
Commit message: "[BEAM-14170] - Create a test that runs sickbayed tests (#17471)"
 > git rev-list --no-walk 0fb68863779bb6cf082cd91331159e5743bb17d6 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins3062760850238497700.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/exknzpdccimtc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #719

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/719/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0fb68863779bb6cf082cd91331159e5743bb17d6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0fb68863779bb6cf082cd91331159e5743bb17d6 # timeout=10
Commit message: "cleaned up TypeScript in coders.ts (#17689)"
 > git rev-list --no-walk 0fb68863779bb6cf082cd91331159e5743bb17d6 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6255318813257170836.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zpber2gkf5tx6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #718

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/718/display/redirect?page=changes>

Changes:

[ilion.beyst] minor: don't capture stderr in kata tests

[Kiley Sok] Update beam-master version for legacy

[Heejong Lee] Fix NonType error when importing google.api_core fails

[noreply] [BEAM-13972] Update documentation for run inference (#17508)

[noreply] [BEAM-14502] Fix: Splitting scans into smaller chunks to buffer reads

[noreply] [BEAM-14218] Add resource location hints to base inference runner.

[noreply] [BEAM-14442] Ask for repro steps/redirect to user list in bug template

[noreply] [BEAM-14166] Push logic in RowWithGetters down into getters and use

[noreply] cleaned up TypeScript in coders.ts (#17689)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0fb68863779bb6cf082cd91331159e5743bb17d6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0fb68863779bb6cf082cd91331159e5743bb17d6 # timeout=10
Commit message: "cleaned up TypeScript in coders.ts (#17689)"
 > git rev-list --no-walk 57f37052067cc690d1515af0cddc604b9c325e11 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4455817510400533953.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/wcaav6nfhms5q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #717

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/717/display/redirect?page=changes>

Changes:

[thiagotnunes] BEAM-14419: Remove invalid mod type

[ihr] [BEAM-14006] Update Python katas to 2.38 and fix issue with one test

[Heejong Lee] [BEAM-14478] Fix missing 'projectId' attribute error

[relax] DLQ for BQ Storage Api writes

[noreply] Bump google.golang.org/api from 0.76.0 to 0.81.0 in /sdks

[noreply] [BEAM-14336] Re-enable `flight_delays_it_test` with

[noreply] [BEAM-11106] small nits to truncate sdf exec unit (#17755)

[noreply] Added standard logging when exception is thrown (#17717)

[noreply] [BEAM-13829] Enable worker status in Go

[noreply] [BEAM-14519] Add website page for Go dependencies (#17766)

[noreply] [BEAM-11106] Validate that DoFn returns Process continuation when

[noreply] [BEAM-14505] Add Dataflow streaming pipeline update support to the Go


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 57f37052067cc690d1515af0cddc604b9c325e11 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 57f37052067cc690d1515af0cddc604b9c325e11 # timeout=10
Commit message: "Merge pull request #17634 from iht/update_python_katas"
 > git rev-list --no-walk c5e521a85f93527b6b3fe20aea505206316ce7ce # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5947605995735129723.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/yqmdqgc2xido2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #716

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/716/display/redirect?page=changes>

Changes:

[Robert Bradshaw] [BEAM-14426] Allow skipping of any output when writing an empty

[Robert Bradshaw] Add skip_if_empty attribute to base class to fix test.

[Jan Lukavský] [BEAM-14492] add flinkConfDir to FlinkPipelineOptions

[noreply] Bump cloud.google.com/go/storage from 1.22.0 to 1.22.1 in /sdks

[noreply] [BEAM-14139] Remove unused Flink 1.11 directory (#17750)

[noreply] [BEAM-14044] Allow ModelLoader to forward BatchElements args (#17527)

[noreply] [BEAM-14481] Remove unnecessary context (#17737)

[noreply] [BEAM-9324] Fix incompatibility of direct runner with cython (#17728)

[noreply] [BEAM-14503] Add support for Flink 1.15 (#17739)

[noreply] Update Beam website to release 2.39.0 (#17690)

[noreply] [BEAM-14509] Add several flags to dataflow runner (#17752)

[Yichi Zhang] Fix 2.38.0 download page.

[noreply] [BEAM-14494] Fix publish_docker_images.sh (#17756)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision c5e521a85f93527b6b3fe20aea505206316ce7ce (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f c5e521a85f93527b6b3fe20aea505206316ce7ce # timeout=10
Commit message: "Merge pull request #17715: [BEAM-14492] add flinkConfDir to FlinkPipelineOptions"
 > git rev-list --no-walk 3e683606d9a03e7da3d37a83eb16c3a6b96068cd # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7690669058136473951.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dtzk5rphffjk6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #715

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/715/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-14471] Adding testcases and examples for xlang Python

[Heejong Lee] update

[Heejong Lee] add DataframeTransform wrapper

[noreply] [BEAM-14298] resolve dependency

[noreply] Fix -- linting issue (#17738)

[noreply] Fix 'NoneType' object has no attribute error

[noreply] [BEAM-12308] change expected value in kakfa IT (#17740)

[noreply] [BEAM-14053] [CdapIO] Add wrapper class for CDAP plugin (#17150)

[noreply] [BEAM-14129] Clean up PubsubLiteIO by removing options that no longer

[noreply] [BEAM-14496] Ensure that precombine is inheriting one of the timestamps


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 3e683606d9a03e7da3d37a83eb16c3a6b96068cd (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 3e683606d9a03e7da3d37a83eb16c3a6b96068cd # timeout=10
Commit message: "[BEAM-14496] Ensure that precombine is inheriting one of the timestamps output values (#17729)"
 > git rev-list --no-walk acea4027b6dd6726d838eaf50dfb5e1605bdf266 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5882662619506502803.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/hoewgbm6atxt2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
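Each of the failures archived below ends the same way: evaluation of sdks/python/apache_beam/testing/load_tests/build.gradle stops at line 49 because a 'shadowJar' property cannot be resolved on the Dataflow Java **** project at configuration time. The actual content of line 49 is not shown in this log; the Groovy sketch below only illustrates the failing access pattern and a null-safe alternative, with the variable names invented for illustration:

    // The final path segment is masked as '****' in this log.
    def workerProject = project(':runners:google-cloud-dataflow-java:****')
    // An eager reference such as `workerProject.shadowJar` raises
    // "Could not get unknown property 'shadowJar'" when that project has not
    // been evaluated yet or no longer registers a shadowJar task.
    def shadowJarTask = workerProject.tasks.findByName('shadowJar')
    if (shadowJarTask != null) {
        println "found ${shadowJarTask.path}"
    } else {
        logger.warn("No 'shadowJar' task on ${workerProject.path} at configuration time")
    }

Because the error is thrown while the load_tests project is still being evaluated, every run below stops after the buildSrc tasks and fails within seconds, before any Dataflow job is submitted.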

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #714

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/714/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14494] Tag rc docker container with format ${RELEASE}rc${RC_NUM}

[noreply] [BEAM-11578] Fix TypeError in dataflow_metrics has 0 distribution sum

[noreply] [BEAM-14499] Step global, unbounded side input case back to warning

[noreply] [BEAM-14484] Step back unexpected primary handling to warnings (#17724)

[noreply] [BEAM-14486] Document pubsubio & fix its behavior. (#17709)

[noreply] [BEAM-14489] Remove non-SDF version of TextIO. (#17712)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision acea4027b6dd6726d838eaf50dfb5e1605bdf266 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f acea4027b6dd6726d838eaf50dfb5e1605bdf266 # timeout=10
Commit message: "[BEAM-14489] Remove non-SDF version of TextIO. (#17712)"
 > git rev-list --no-walk 1dfab628d03e161cf003dad01f55b9d6674aa8e2 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8654532940689401765.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/h2y7yiqeaxjn4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #713

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/713/display/redirect?page=changes>

Changes:

[noreply] Add clarification on Filter transform's input function to pydoc.

[noreply] [BEAM-14367]Flaky timeout in


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1dfab628d03e161cf003dad01f55b9d6674aa8e2 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1dfab628d03e161cf003dad01f55b9d6674aa8e2 # timeout=10
Commit message: "[BEAM-14367]Flaky timeout in StatefulDoFnOnDirectRunnerTest.test_dynamic_timer_clear_then_set_timer (#17569)"
 > git rev-list --no-walk 0c9cf43a7edae2e2a2622a8f4241b64a638121bb # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins9125435239829279642.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/yxrqhv3w6y2m6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #712

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/712/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0c9cf43a7edae2e2a2622a8f4241b64a638121bb (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0c9cf43a7edae2e2a2622a8f4241b64a638121bb # timeout=10
Commit message: "[BEAM-13015] Only create a TimerBundleTracker if there are timers. (#17445)"
 > git rev-list --no-walk 0c9cf43a7edae2e2a2622a8f4241b64a638121bb # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6438190557656233332.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/bezi24isaxpfe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #711

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/711/display/redirect?page=changes>

Changes:

[yathu] Add labels for typescript PRs

[noreply] Bump google.golang.org/grpc from 1.45.0 to 1.46.2 in /sdks (#17677)

[noreply] [BEAM-13015] Only create a TimerBundleTracker if there are timers.


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0c9cf43a7edae2e2a2622a8f4241b64a638121bb (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0c9cf43a7edae2e2a2622a8f4241b64a638121bb # timeout=10
Commit message: "[BEAM-13015] Only create a TimerBundleTracker if there are timers. (#17445)"
 > git rev-list --no-walk 301acc825a808ae1d62f5115601a7d81b2514e7d # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2507591418752031365.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/xpmfqm2o4qqlc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #710

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/710/display/redirect?page=changes>

Changes:

[chamikaramj] Corrects I/O connectors availability status in Beam Website.

[singh.vikash2310] fixed typos in README.md

[noreply] Update the PTransform and associated APIs to be less class-based.

[noreply] Vortex performance improvement: Enable multiple stream clients per

[noreply] [BEAM-14488] Alias async flags. (#17711)

[noreply] [BEAM-14487] Make drain & update terminal states. (#17710)

[noreply] [BEAM-14484] Improve behavior surrounding primary roots in

[noreply] Improve validation error message (#17719)

[noreply] Remove unused validation configurations. (#17705)

[bulat.safiullin] [BEAM-14418] added arrows to slider

[noreply] Minor: Bump Dataflow container versions (#17684)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 301acc825a808ae1d62f5115601a7d81b2514e7d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 301acc825a808ae1d62f5115601a7d81b2514e7d # timeout=10
Commit message: "Merge pull request #17722: [BEAM-14418] added arrows to slider"
 > git rev-list --no-walk 212d63d291b0c4cbc685c320ea5b8768b9234b64 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins893183476287720211.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/hepz45lutjlci

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #709

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/709/display/redirect?page=changes>

Changes:

[bulat.safiullin] [BEAM-14428] change text, change styling of connectors and contribute

[noreply] [BEAM-10529] update KafkaIO Xlang integration test to publish and

[noreply] Fix a few small linting bugs (#17695)

[noreply] Bump github.com/lib/pq from 1.10.5 to 1.10.6 in /sdks (#17691)

[noreply] Update release-guide.md


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-12 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 212d63d291b0c4cbc685c320ea5b8768b9234b64 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 212d63d291b0c4cbc685c320ea5b8768b9234b64 # timeout=10
Commit message: "Merge pull request #17572: [BEAM-14428] I/O, community, and contribute pages improvements"
 > git rev-list --no-walk 857f8d300d942177ebc4244b9b405222d7deb26d # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins9155362939050882918.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 3 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 19s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/gngmgv3yi4gb4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #708

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/708/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12000] Update programming-guide.md (#17679)

[noreply] [BEAM-14467] Fix bug where run_pytest.sh does not elevate errors raised

[noreply] [BEAM-14474] Suppress 'Mean of empty slice' Runtime Warning in dataframe


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-6 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 857f8d300d942177ebc4244b9b405222d7deb26d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 857f8d300d942177ebc4244b9b405222d7deb26d # timeout=10
Commit message: "[BEAM-14474] Suppress 'Mean of empty slice' Runtime Warning in dataframe unit test (#17682)"
 > git rev-list --no-walk a37d324791b5e67d1b78c7e9cc0aaa5653b42826 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4468541818999584081.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518150711 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ih3ixp7aoy3iw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #707

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/707/display/redirect?page=changes>

Changes:

[mmack] [BEAM-14334] Remove remaining forkEvery 1 from all Spark tests and stop

[noreply] [BEAM-14473] Throw error if using globally windowed, unbounded side

[noreply] [BEAM-14440] Add basic fuzz tests to the coders package (#17587)

[noreply] [BEAM-14035 ] Implement BigQuerySchema Read/Write TransformProvider

[noreply] Add Akvelon to case-studies (#17611)

[noreply] Merge pull request #17520 from BEAM-12356 Close DatasetService leaked

[noreply] Adding eslint and lint configuration to TypeScript SDK (#17676)

[noreply] Update release-guide.md

[noreply] Update release-guide.md

[noreply] [BEAM-14411] Re-enable TypecodersTest, fix most issues (#17547)

[noreply] Merge pull request #17678 from [BEAM-14460] [Playground] WIP. Fix error

[Alexey Romanenko] [BEAM-14035] Fix checkstyle issue

[noreply] [BEAM-14441] Automatically assign issue labels based on responses to

[noreply] README update for the Docker Error 255 during Website launch on Apple


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision a37d324791b5e67d1b78c7e9cc0aaa5653b42826 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f a37d324791b5e67d1b78c7e9cc0aaa5653b42826 # timeout=10
Commit message: "README update for the Docker Error 255 during Website launch on Apple Silicon (#17456)"
 > git rev-list --no-walk e6aab063e09ba52703e0417221de4c4466f8fd13 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins127776825587521914.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0517153729 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/qav6w4pspp6lw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #706

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/706/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13015] Update the SDK harness grouping table to be memory bounded

[noreply] [BEAM-13982] Added output of logging for python E2E pytests (#17637)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e6aab063e09ba52703e0417221de4c4466f8fd13 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e6aab063e09ba52703e0417221de4c4466f8fd13 # timeout=10
Commit message: "[BEAM-13982] Added output of logging for python E2E pytests (#17637)"
 > git rev-list --no-walk 5064cc247ba3ec2697cd7493b14cef8567d614f6 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5221359538103121011.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0516150708 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/f2jk6gpsu3vae

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #705

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/705/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14470] Use Generic Registrations in loadtests. (#17673)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 5064cc247ba3ec2697cd7493b14cef8567d614f6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 5064cc247ba3ec2697cd7493b14cef8567d614f6 # timeout=10
Commit message: "[BEAM-14470] Use Generic Registrations in loadtests. (#17673)"
 > git rev-list --no-walk 780ad62d42f8216ba030e97c203fc2310cd041b0 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2403426245663029742.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0515150637 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zank6nzhhask6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #704

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/704/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-14455] Add UUID to sub-schemas for PythonExternalTransform

[Heejong Lee] [BEAM-14430] Adding a logical type support for Python callables to Row

[Heejong Lee] add urn, type inference for PythonCallableSource

[Heejong Lee] fix lint errors

[Heejong Lee] move logical types def

[Heejong Lee] add micros_instant urn

[Heejong Lee] put a default type hint for PythonCallableSource

[Heejong Lee] add comment

[noreply] Revert "Better test assertion. (#17551)"

[noreply] Bump github.com/spf13/cobra from 1.3.0 to 1.4.0 in /sdks (#17647)

[noreply] [BEAM-14465] Reduce DefaultS3ClientBuilderFactory logging to debug level

[noreply] Merge pull request #17365 from [BEAM-12482] Update Schema Destination

[noreply] [BEAM-14014] Support impersonation credentials in dataflow runner

[noreply] [BEAM-14469] Allow nil primary returns from TrySplit in  a single-window

[noreply] Add some auto-starting runners to the typescript SDK. (#17580)

[noreply] [BEAM-14371] (and BEAM-14372) - enable a couple staticchecks (#17670)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 780ad62d42f8216ba030e97c203fc2310cd041b0 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 780ad62d42f8216ba030e97c203fc2310cd041b0 # timeout=10
Commit message: "[BEAM-14371] (and BEAM-14372) - enable a couple staticchecks (#17670)"
 > git rev-list --no-walk 787479f1a5e178333ded3ff02331163c4fe75f1a # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2338174919279329659.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0514150554 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/4y2otid5yulri

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #703

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/703/display/redirect?page=changes>

Changes:

[dannymccormick] [BEAM-14441] Add GitHub issue templates

[dannymccormick] Ask for beam version + other dependencies

[dannymccormick] We don't need outage

[dannymccormick] Cut p4

[chamikaramj] Updates CHANGES.md to include some recently discovered known issues

[dannymccormick] Pare down to fewer templates

[noreply] Revert "[BEAM-14429] Force java load test on dataflow runner v2

[noreply] [BEAM-14347] Add generic registration feature to CHANGES (#17643)

[noreply] Better test assertion. (#17551)

[noreply] Bump github.com/google/go-cmp from 0.5.7 to 0.5.8 in /sdks (#17628)

[noreply] Bump github.com/testcontainers/testcontainers-go in /sdks (#17627)

[noreply] Bump github.com/lib/pq from 1.10.4 to 1.10.5 in /sdks (#17626)

[noreply] Merge pull request #17584 from [BEAM-14415] Exception handling tests and

[noreply] Bump cloud.google.com/go/pubsub from 1.18.0 to 1.21.1 in /sdks (#17646)

[noreply] Merge pull request #17408 from [BEAM-14312] [Website] change section

[noreply] Bump cloud.google.com/go/bigquery from 1.28.0 to 1.32.0 in /sdks

[noreply] [BEAM-14347] Add function for simple function registration (#17650)

[noreply] Drop dataclasses requirement, we only support python 3.7+ (#17640)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-12 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 787479f1a5e178333ded3ff02331163c4fe75f1a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 787479f1a5e178333ded3ff02331163c4fe75f1a # timeout=10
Commit message: "Drop dataclasses requirement, we only support python 3.7+ (#17640)"
 > git rev-list --no-walk fd61a90057011270dbf9a36c73b5baaf120100e2 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8937874220063793751.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0513150610 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy and 1 stopped Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 13s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/q65m3ko6p6ble

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #702

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/702/display/redirect?page=changes>

Changes:

[randomstep] [BEAM-14096] bump junit-quickcheck to 1.0

[noreply] [BEAM-11104] Add self-checkpointing to CHANGES.md (#17612)

[noreply] [BEAM-14081] [CdapIO] Add context classes for CDAP plugins (#17104)

[noreply] [BEAM-12526] Add Dependabot (#17563)

[noreply] Remove python 3.6 postcommit from mass_comment.py (#17630)

[noreply] [BEAM-14347] Add some benchmarks for generic registration (#17613)

[noreply] Correctly route go dependency changes to go label (#17632)

[noreply] [BEAM-13695] Add jamm jvm options to Java 11 (#17178)

[noreply] [BEAM-14334] Fix leakage of SparkContext in Spark runner tests to remove

[noreply] Typo & link update (#17633)

[noreply] Trigger go precommits on go mod/sum changes (#17636)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision fd61a90057011270dbf9a36c73b5baaf120100e2 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f fd61a90057011270dbf9a36c73b5baaf120100e2 # timeout=10
Commit message: "Trigger go precommits on go mod/sum changes (#17636)"
 > git rev-list --no-walk 0f38c82007bee45c375ec75a5c7af2c672483a19 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2192723667681767660.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0512150625 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 15s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/epfzqhm7h7jbm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #701

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/701/display/redirect?page=changes>

Changes:

[johnjcasey] [BEAM-14448] add datastore test

[yathu] [BEAM-14423] Add test cases for BigtableIO.BigtableWriterFn fails due to

[Pablo Estrada] Revert "Merge pull request #17517 from [BEAM-14383] Improve "FailedRows"

[noreply] [BEAM-14229] Fix SyntheticUnboundedSource duplication from checkpoint

[noreply] [BEAM-14347] Rename registration package to register (#17603)

[noreply] [BEAM-11104] Add self-checkpointing integration test (#17590)

[noreply] [BEAM-5492] Python Dataflow integration tests should export the pipeline

[noreply] [BEAM-14396] Bump httplib2 upper bound. (#17602)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0f38c82007bee45c375ec75a5c7af2c672483a19 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0f38c82007bee45c375ec75a5c7af2c672483a19 # timeout=10
Commit message: "[BEAM-14396] Bump httplib2 upper bound. (#17602)"
 > git rev-list --no-walk 5c21fbccec5e1e831dd0040bd7f631c050865430 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4770161501794335272.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0511150544 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dkbqn64q3jm6u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #700

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/700/display/redirect?page=changes>

Changes:

[Alexey Romanenko] [BEAM-12918] Add PostCommit_Java_Tpcds_Spark job

[noreply] Merge pull request #17559 from [BEAM-14423] Add exception injection

[noreply] [BEAM-11104] Allow self-checkpointing SDFs to return without finishing

[noreply] Merge pull request #17544 from [BEAM-14415] Exception handling tests for

[noreply] Merge pull request #17565 from [BEAM-14413] add Kafka exception test

[noreply] Merge pull request #17555 from [BEAM-14417] Adding exception handling

[noreply] [BEAM-14433] Improve Go split error message. (#17575)

[noreply] [BEAM-14429] Force java load test on dataflow runner v2

[noreply] Merge pull request #17577 from [BEAM-14435] Adding exception handling

[noreply] [BEAM-14347] Add generic registration functions for iters and emitters

[noreply] [BEAM-14169] Add Credentials rotation cron job for clusters (#17383)

[noreply] [BEAM-14347] Add generic registration for accumulators (#17579)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 5c21fbccec5e1e831dd0040bd7f631c050865430 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 5c21fbccec5e1e831dd0040bd7f631c050865430 # timeout=10
Commit message: "Merge pull request #15679 from aromanenko-dev/BEAM-12918-tpcds-jenkins"
 > git rev-list --no-walk 0ea809590ee3fa271b609e02d17bd1c9ec1eddf9 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7381207504597737380.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0510150620 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 12s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/p7sy5ylo5vcxq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #699

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/699/display/redirect?page=changes>

Changes:

[elias.segundo] Changing elegibility to AllNodeElegibility

[chamikaramj] Adds code reviewers for GCP I/O connectors and KafkaIO to Beam OWNERS

[andyye333] Add extra details to PubSub matcher errors


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0ea809590ee3fa271b609e02d17bd1c9ec1eddf9 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0ea809590ee3fa271b609e02d17bd1c9ec1eddf9 # timeout=10
Commit message: " [BEAM-14439] [BEAM-12673] Add extra details to PubSub matcher errors #17586"
 > git rev-list --no-walk 70b7567de56af29745d98d5d24d2e2427045dd9d # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins865063587422456133.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0509150552 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 13s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dcztjjog4ohqa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #698

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/698/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 70b7567de56af29745d98d5d24d2e2427045dd9d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 70b7567de56af29745d98d5d24d2e2427045dd9d # timeout=10
Commit message: "Merge pull request #17482 from ihji/BEAM-14374"
 > git rev-list --no-walk 70b7567de56af29745d98d5d24d2e2427045dd9d # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins9097543835111371904.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0508150546 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/37jwps7v2az3i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #697

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/697/display/redirect?page=changes>

Changes:

[kevinsijo] Setting up a basic directory

[kevinsijo] Mirroring Python SDK's directory structure

[kerrydc] Adds initial tests

[kevinsijo] 'runners' is the correct directory name

[Pablo Estrada] sketching the core API for JS SDK

[jonathanlui] add .gitignore for node/ts project

[Robert Bradshaw] Worker directory.

[Robert Bradshaw] Fix compile errors with explicit any for callables.

[Robert Bradshaw] Add worker entry point.

[Robert Bradshaw] Add proto generation code.

[Robert Bradshaw] Add generated proto files.

[Robert Bradshaw] Attempts to get ts protos to compile.

[Robert Bradshaw] Exclude ts protos for now.

[Robert Bradshaw] More changes to get ts protos working.

[Robert Bradshaw] Update scripts and config to get protos compiling.

[Robert Bradshaw] Update generated files.

[jonathanlui] add build and clean script to compile ts

[Robert Bradshaw] Generate server for loopback worker.

[Robert Bradshaw] Generated grpc servers for loopback.

[Robert Bradshaw] Add typescript formatter.

[Robert Bradshaw] Loopback server (that does nothing).

[Robert Bradshaw] Working server.

[Pablo Estrada] Starting expansion of primitive transforms

[Pablo Estrada] Starting to implement and support standard coders

[Robert Bradshaw] Also generate grpc clients.

[Robert Bradshaw] Basic implementation of worker harness.

[Pablo Estrada] fix the build

[Robert Bradshaw] Add some missing files for worker harness.

[Robert Bradshaw] Refactor operators to use registration.

[jonathanlui] enable ts in mocha

[jonathanlui] update readme

[jonathanlui] --save-dev @types/mocha

[jonathanlui] translate core_test.js to typescript

[Robert Bradshaw] Encapsulate worker service in a class.

[Kenneth Knowles] Port standard_coders_test to typescript (superficially)

[Pablo Estrada] Starting the proto translation of Impulse, ParDo, GBK

[Robert Bradshaw] Add some tests for the worker code.

[Robert Bradshaw] Fixing old lock file error.

[Pablo Estrada] Adding transform names and fixing GBK coder issue

[Robert Bradshaw] npx tsfmt -r src/apache_beam/base.ts src/apache_beam/transforms/core.ts

[Kenneth Knowles] switch to import style require() statements

[Kenneth Knowles] Add Coder interface using protobufjs classes

[Kenneth Knowles] BytesCoder with some failures

[noreply] Added GeneralObjectCoder and using it as coder for most transforms (#9)

[Kenneth Knowles] Fix order of arguments to deepEqual

[Kenneth Knowles] Encode expected encoding as binary

[Robert Bradshaw] Refactor API to allow for composites.

[jrmccluskey] Initial setup for automated Java expansion startup

[jrmccluskey] Update exp_service.ts

[Kenneth Knowles] Fix up coder deserialization

[Robert Bradshaw] Simplify GBK coder computation.

[Robert Bradshaw] Remove top-level PValue.

[Pablo Estrada] Make tests green

[Robert Bradshaw] Rename PValueish to PValue.

[jonathanlui] node runner

[jonathanlui] whitespaces

[Robert Bradshaw] Make Runner.run async.

[jonathanlui] bson and fast-deep-equal should not be listed as devdependency

[jrmccluskey] Add basic Dockerfile that starts ExternalWorkerPool

[Robert Bradshaw] Direct runner.

[kevinsijo] Testing expansion service communication

[Robert Bradshaw] Added flatten, assertion checkers.

[Pablo Estrada] progress on basic coders

[Robert Bradshaw] Fixing the build.

[Robert Bradshaw] Cleanup, simplify access.

[Pablo Estrada] Adding limited support for KVCoder and IterableCoder

[Robert Bradshaw] Introduce PipelineContext.

[Robert Bradshaw] Add toProto to all coders.

[Robert Bradshaw] Some work with coders.

[Robert Bradshaw] Remove debug logging.

[Robert Bradshaw] Use coders over data channel.

[Kenneth Knowles] explicitly sequence sub-coder serializations

[Kenneth Knowles] no more need to extend FakeCoder

[Kenneth Knowles] actually advance reader

[Kenneth Knowles] autoformat

[Kenneth Knowles] protobufjs already can write and read signed varints

[Kenneth Knowles] with improved test harness, kv has many more failures

[Kenneth Knowles] read bytescoder from correct position

[Kenneth Knowles] no more fake coders

[Kenneth Knowles] varint examples all work

[Kenneth Knowles] simplify coder value parsing

[Kenneth Knowles] global window coder

[Kenneth Knowles] fix swapEndian32

[Robert Bradshaw] Add P(...) operator.

[kevinsijo] Implementing RowCoder encoding.

[jrmccluskey] remove unused container dir

[kevinsijo] Corrected sorting of encoded positions to reflect an argsort instead.

[Robert Bradshaw] Populate environments.

[kevinsijo] Implementing RowCoder decoding.

[Kenneth Knowles] preliminary unbounded iterable coder

[Kenneth Knowles] friendlier description of standard coder test case

[Kenneth Knowles] fix test harness; iterable works

[jrmccluskey] first pass at boot.go

[jonathanlui] update package-lock.json

[jonathanlui] make NodeRunner a subclass of Runner

[jonathanlui] add waitUntilFinish interface member

[Pablo Estrada] Adding double coder

[Kenneth Knowles] scaffolding for windowed values

[Pablo Estrada] Adding type information to PCollection and PTransform

[jonathanlui] fix direct runner

[Pablo Estrada] Adding typing information for DoFns

[Kenneth Knowles] add interval window

[Robert Bradshaw] Export PValue.

[Robert Bradshaw] Add CombineFn interface.

[Robert Bradshaw] Typed flatten.

[jonathanlui] add runAsync method to base.Runner

[Kenneth Knowles] add Long package

[Pablo Estrada] Adding more types. Making PValue typed

[Kenneth Knowles] instant coder draft

[Robert Bradshaw] Return job state from direct runner.

[Kenneth Knowles] type instant = long

[jonathanlui] implement NodeRunner.runPipeline

[Kenneth Knowles] autoformat

[kevinsijo] Completed implementation of basic row coder

[Kenneth Knowles] Fix IntervalWindowCoder, almost

[Kenneth Knowles] fix interval window coder

[Kenneth Knowles] autoformat

[Robert Bradshaw] loopback runner works

[Kenneth Knowles] move core element types into values.ts

[Kenneth Knowles] just build object directly to be cool

[Robert Bradshaw] GBK working on ULR.

[Robert Bradshaw] Async transforms.

[Robert Bradshaw] External transform graph splicing.

[Kenneth Knowles] progress on windowed value: paneinfo encoding

[Robert Bradshaw] Fix merge.

[Robert Bradshaw] autoformat

[Kenneth Knowles] full windowed value coder

[kerrydc] Updates tests to use correct types, adds generics where needed to DoFns

[Robert Bradshaw] Add serialization libraries.

[Robert Bradshaw] Add Split() PTransform, for producing multiple outputs from a single

[Robert Bradshaw] Schema-encoded external payloads.

[kevinsijo] Adding Schema inference from JSON

[Pablo Estrada] Removing unused directories

[Pablo Estrada] Support for finishBundle and improving typing annotations.

[Pablo Estrada] A base implementation of combiners with GBK/ParDo

[Robert Bradshaw] Fully propagate windowing information in both remote and direct runner.

[Robert Bradshaw] Make args and kwargs optional for python external transform.

[Robert Bradshaw] Infer schema for external transforms.

[Pablo Estrada] Implementing a custom combine fn as an example. Small fixes

[Robert Bradshaw] Fix missing windowing information in combiners.

[Robert Bradshaw] PostShuffle needn't group by key as that's already done.

[Robert Bradshaw] Guard pre-combine for global window only.

[Robert Bradshaw] WindowInto

[Robert Bradshaw] Fix optional kwargs.

[Robert Bradshaw] A couple of tweaks for js + py

[Robert Bradshaw] Add windowing file.

[Robert Bradshaw] CombineBy transform, stand-alone WordCount.

[Robert Bradshaw] cleanup

[Robert Bradshaw] Actually fix optional external kwargs.

[Robert Bradshaw] Demo2, textio read.

[Robert Bradshaw] Add command lines for starting up the servers.

[Robert Bradshaw] Run prettier on the full codebase.

[Robert Bradshaw] Update deps.

[Pablo Estrada] Adding docstrings for core.ts. Prettier dependency

[Pablo Estrada] Documenting coder interfaces

[Pablo Estrada] Added documentation for a few standard coders

[Robert Bradshaw] Unified grouping and combining.

[Robert Bradshaw] Allow PCollection ids to be lazy.

[Robert Bradshaw] Reorganize module structure.

[Robert Bradshaw] A couple more renames.

[Robert Bradshaw] Simplify.

[Robert Bradshaw] Consolidation.

[Robert Bradshaw] Fix build.

[Robert Bradshaw] Add optional context to ParDo.

[Robert Bradshaw] fixup: iterable coder endian sign issue

[Robert Bradshaw] omit context for map(console.log)

[Robert Bradshaw] Fix ReadFromText coders.

[Robert Bradshaw] Flesh out README with overview and current state.

[noreply] Readme typo

[Robert Bradshaw] Two more TODOs.

[noreply] Add a pointer to the example wordcount to the readme.

[Pablo Estrada] Documenting coders and implementing unknown-length method

[Robert Bradshaw] UUID dependency.

[Robert Bradshaw] Artifact handling.

[Robert Bradshaw] Properly wait on data channel for bundle completion.

[Robert Bradshaw] Automatic java expansion service startup.

[Robert Bradshaw] Process promises.

[Robert Bradshaw] Implement side inputs.

[Robert Bradshaw] Cleanup.

[Robert Bradshaw] Put complex context stuff in its own file.

[Robert Bradshaw] Rename BoundedWindow to just Window.

[Robert Bradshaw] Alternative splitter class.

[Pablo Estrada] Documenting internal functions

[Robert Bradshaw] Take a pass clarifying the TODOs.

[Robert Bradshaw] Sql transform wrapper.

[Robert Bradshaw] Incorporate some feedback into the TODOs.

[Robert Bradshaw] More TODOs.

[Robert Bradshaw] Remove app placeholder.

[Robert Bradshaw] Apache license headers.

[Robert Bradshaw] More TODOs

[jankuehle] Suggestions for TypeScript todos

[dannymccormick] Add actions for typescript sdk

[dannymccormick] Fix test command

[noreply] Add missing version

[dannymccormick] Fix codecovTest command

[noreply] Only do prettier check on linux

[noreply] Only get codecov on linux

[Robert Bradshaw] Resolve some comments.

[Robert Bradshaw] Fix compile errors.

[Robert Bradshaw] Prettier.

[Robert Bradshaw] Re-order expandInternal arguments pending unification.

[Robert Bradshaw] More consistent and stricter PTransform naming.

[Robert Bradshaw] Notes on explicit, if less idiomatic, use of classes.

[Robert Bradshaw] Let DoFn be an interface rather than a class.

[Robert Bradshaw] Provide DoFn context to start and finish bundle.

[Robert Bradshaw] Optional promise code simplification.

[Robert Bradshaw] Cleanup todos.

[Robert Bradshaw] Avoid any type where not needed.

[Robert Bradshaw] Apache RAT excludes for typescript.

[Robert Bradshaw] Remove empty READMEs.

[Robert Bradshaw] Add licences statement to readme files.

[Robert Bradshaw] More RAT fixes.

[Robert Bradshaw] Another unsupported coder.

[Robert Bradshaw] Remove debugging code.

[noreply] Fix automatic naming with code coverage.

[Robert Bradshaw] Coders cleanup.

[Robert Bradshaw] Add tests for RowCoder.

[Robert Bradshaw] Normalize capitalization, comments.

[Robert Bradshaw] Install typescript closure packages.

[Robert Bradshaw] npm audit fix

[Robert Bradshaw] Move more imports out of base.

[Robert Bradshaw] Changes needed to compile with ts closure plugin.

[Robert Bradshaw] Use ttsc and ts-closure-transform plugin.

[Robert Bradshaw] Serialization registration to actually get serialization working.

[Robert Bradshaw] Container images working on local runner.

[Robert Bradshaw] Add a portable job server that proxies the Dataflow backend. (#17189)

[Robert Bradshaw] Improvements to dataflow job service for non-Python jobs.

[Robert Bradshaw] Get dataflow working.

[Robert Bradshaw] User friendly pipeline options.

[Robert Bradshaw] Less classes, more functions.

[Robert Bradshaw] Add new nullable standard coder.

[Robert Bradshaw] Make Apache Rat happy.

[Robert Bradshaw] Disable broken codecov.

[Robert Bradshaw] Remove last uses of base.ts.

[Robert Bradshaw] Remove unneeded file.

[Robert Bradshaw] Remove more unneeded/unused files.

[Robert Bradshaw] Cleanup tests.

[Robert Bradshaw] Minor cleanups to coder tests.

[noreply] Quote pip install package name

[noreply] [BEAM-14374] Fix module import error in FullyQualifiedNamedTransform

[Robert Bradshaw] Addressing issues from the review.

[noreply] Apply suggestions from code review.

[Robert Bradshaw] Post-merge fixes.

[dannymccormick] Delete tags.go

[Robert Bradshaw] Update tests to use our actual serialization libraries.

[Robert Bradshaw] Another pass at TODOs, removing finished items.

[Heejong Lee] [BEAM-14146] Python Streaming job failing to drain with BigQueryIO write

[Heejong Lee] add test

[noreply] Merge pull request #17490 from [BEAM-14370] [Website] Add new page about

[noreply] [BEAM-14332] Refactored cluster management for Flink on Dataproc

[noreply] [BEAM-13988] Update mtime to use time.UnixMilli() calls (#17578)

[noreply] Fixing patching error on missing dependencies (#17564)

[noreply] Merge pull request #17517 from [BEAM-14383] Improve "FailedRows" errors

[Heejong Lee] add test without mock


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 70b7567de56af29745d98d5d24d2e2427045dd9d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 70b7567de56af29745d98d5d24d2e2427045dd9d # timeout=10
Commit message: "Merge pull request #17482 from ihji/BEAM-14374"
 > git rev-list --no-walk 2af0dc79912011e46b297c2b8091a2ee0a191510 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8088870281924289066.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0507150554 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/uj5outmg4rhca

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
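
Note on the recurring failure: every report in this archive stops at the same point, where the load-test Gradle script reads a 'shadowJar' property off another project at configuration time and Gradle cannot resolve it. The sketch below illustrates the general pattern that produces this class of error and one common defensive rewrite; the ':example:worker' project path, task names, and the rewrite itself are illustrative assumptions, not the actual contents of the Beam build files or the fix that was eventually applied.

    // Illustrative build.gradle sketch (Groovy DSL); ':example:worker' is hypothetical.

    // This style fails with "Could not get unknown property 'shadowJar' for project
    // ':example:worker' of type org.gradle.api.Project" whenever the referenced
    // project has not been evaluated yet or no longer registers a 'shadowJar' task:
    task copyWorkerJar(type: Copy) {
      from project(':example:worker').shadowJar
      into layout.buildDirectory.dir('worker-jar')
    }

    // A more defensive form declares the evaluation ordering explicitly and defers
    // the task lookup until the task graph is built:
    evaluationDependsOn(':example:worker')
    task copyWorkerJarSafe(type: Copy) {
      dependsOn ':example:worker:shadowJar'
      from { project(':example:worker').tasks.named('shadowJar') }
      into layout.buildDirectory.dir('worker-jar')
    }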

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #696

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/696/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14173] Fix Go Loadtests on Dataflow & partial fix for Flink

[noreply] Upgrade python sdk container requirements. (#17549)

[noreply] Merge pull request #17497: [BEAM-11205] Update GCP Libraries BOM version

[noreply] [BEAM-12603] Add retry on grpc data channel and remove retry from test.

[noreply] Merge pull request #17359: [BEAM-14303] Add a way to exclude output

[Kenneth Knowles] Add parameter for service account impersonation in GCP credentials

[noreply] [BEAM-14347] Allow users to optimize DoFn execution with a single

[noreply] [BEAM-5878] Add (failing) kwonly-argument test (#17509)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2af0dc79912011e46b297c2b8091a2ee0a191510 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2af0dc79912011e46b297c2b8091a2ee0a191510 # timeout=10
Commit message: "Merge pull request #17394: [BEAM-14014] Add parameter for service account impersonation in GCP credentials"
 > git rev-list --no-walk 017f846ca342745cc1043c45b9ff25f6561d8dc0 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins3989664279962806608.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0506150607 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/3ycrpouqvvdrk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #695

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/695/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-9245] Unable to pull datastore Entity which contains dict

[bulat.safiullin] [BEAM-14382] [Website] add banner container with css, images, html

[Jan Lukavský] [BEAM-14196] add test verifying output watermark propagation in bundle

[Jan Lukavský] [BEAM-14196] Fix FlinkRunner mid-bundle output watermark handling

[bulat.safiullin] [BEAM-14382] change mobile banner img, add padding to banner section

[ahmedabualsaud] fix test decorator typo

[noreply] Merge pull request #17440 from [BEAM-14329] Enable exponential backoff

[noreply] [BEAM-11104] Fix output forwarding issue for ProcessContinuations

[noreply] re-add testing package to pydoc (#17524)

[Heejong Lee] add test

[noreply] [BEAM-14250] Amended the workaround (#17531)

[noreply] [BEAM-11104] Fix broken split result validation (#17546)

[noreply] Fixed a SQL and screenshots in the Beam SQL blog (#17545)

[noreply] Merge pull request #17417: [BEAM-14388] Address some performance

[noreply] [BEAM-14386] [Flink] Support for scala 2.12 (#17512)

[noreply] [BEAM-14294] Worker changes to support trivial Batched DoFns (#17384)

[zyichi] Moving to 2.40.0-SNAPSHOT on master branch.

[zyichi] Move master readme.md to 2.40.0

[noreply] [BEAM-14048] [CdapIO] Add ConfigWrapper for building CDAP PluginConfigs


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 017f846ca342745cc1043c45b9ff25f6561d8dc0 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 017f846ca342745cc1043c45b9ff25f6561d8dc0 # timeout=10
Commit message: "Merge pull request #17552 from y1chi/update_md"
 > git rev-list --no-walk 43d488c55cd25290c6f560f6649597fcc00dcc42 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins21008417301839079.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0505150604 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 12s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...

Publishing failed.

The build scan server appears to be unavailable.
Please check https://status.gradle.com for the latest service status.

If the service is reported as available, please report this problem via https://gradle.com/help/plugin and include the following via copy/paste:

----------
Gradle version: 7.4
Plugin version: 3.4.1
Request URL: https://status.gradle.com
Request ID: 77b509a8-bc17-4e20-8caf-d6cd3bf2df71
Response status code: 405
Response server type: Varnish
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #694

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/694/display/redirect?page=changes>

Changes:

[noreply] fix: JDBC config schema fields order

[Brian Hulette] Revert "Merge pull request #17255 from kileys/test-revert"

[Brian Hulette] BEAM-14231: bypass schema cache for

[noreply] [BEAM-13657] Follow up update version warning in __init__ (#17493)

[noreply] Merge pull request #17431 from [BEAM-14273] Add integration tests for BQ

[noreply] Merge pull request #17205 from [BEAM-14145] [Website] add carousel to

[noreply] [BEAM-14064] fix es io windowing (#17112)

[noreply] [BEAM-13670] Upgraded ipython from v7 to v8 (#17529)

[noreply] [BEAM-11104] Enable ProcessContinuation return values, add unit test

[Robert Bradshaw] [BEAM-14403] Allow Prime to be used with legacy workers.

[noreply] [BEAM-11106] Support drain in Go SDK (#17432)

[noreply] add __Init__ to inference. (#17514)

[nielm] [BEAM-14405] Fix NPE when ProjectID is not specified in a template


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 43d488c55cd25290c6f560f6649597fcc00dcc42 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 43d488c55cd25290c6f560f6649597fcc00dcc42 # timeout=10
Commit message: "Merge pull request #17540: [BEAM-14405] Fix NPE when ProjectID is not specified in a template execution"
 > git rev-list --no-walk 0daef62a7bd993b13064de80588e343ee764e004 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6083874758662682046.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0428111153 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/nrijpmjypb46y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #693

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/693/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Allow arithmetic between deferred scalars.

[noreply] [BEAM-8688] Upgrade GCSIO to 2.2.6 (#17486)

[noreply] [BEAM-14253] patch SubscriptionPartitionLoader to work around a dataflow

[noreply] Add website link log to notify user of pre-build workflow. (#17498)

[noreply] [BEAM-11105] Add timestamp observing watermark estimation (#17476)

[noreply] Merge pull request #17487 from Adding user-agent to GCS client in Python

[noreply] [BEAM-10265] Display error message if trying to infer recursive schema

[noreply] [BEAM-12575] Upgraded ipykernel from v5 to v6 (#17526)

[noreply] [BEAM-11105] Add docs + CHANGES.md entry for Go Watermark Estimation

[noreply] Merge pull request #17380 from [BEAM-14314][BEAM-9532] Add last_updated


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0daef62a7bd993b13064de80588e343ee764e004 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0daef62a7bd993b13064de80588e343ee764e004 # timeout=10
Commit message: "Merge pull request #17380 from [BEAM-14314][BEAM-9532] Add last_updated field in filesystem.FileMetaData"
 > git rev-list --no-walk e0166e294be4e4b2a3d219d3d18af0fa78c8fc92 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins393121336740370924.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0428111153 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/uzn5ftj3dq7oa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #692

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/692/display/redirect?page=changes>

Changes:

[yathu] [BEAM-14375] Fix Java Wordcount Dataflow postcommit

[noreply] [BEAM-11105] Add manual watermark estimation (#17475)

[noreply] [BEAM-14390] Set user-agent when pulling licenses to avoid 403s (#17521)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e0166e294be4e4b2a3d219d3d18af0fa78c8fc92 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e0166e294be4e4b2a3d219d3d18af0fa78c8fc92 # timeout=10
Commit message: "Merge pull request #1748: [BEAM-14375] Fix Java Wordcount Dataflow postcommit for Gradle 7.4"
 > git rev-list --no-walk 4b413bbb5f8807b0f7a284fd818f2772f036fe55 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8918485093624915538.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0428111153 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins
> Task :buildSrc:check
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 26s
10 actionable tasks: 8 executed, 1 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/u7aitczvohw7y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #691

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/691/display/redirect?page=changes>

Changes:

[noreply] Revert "Improvement to Seed job configuration to launch against PRs

[ilion.beyst] Minor: fix typo

[noreply] Merge pull request #17422 from [BEAM-14344]: remove tracing from


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4b413bbb5f8807b0f7a284fd818f2772f036fe55 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4b413bbb5f8807b0f7a284fd818f2772f036fe55 # timeout=10
Commit message: "Merge pull request #17515 from [BEAM-14377] Revert "Improvement to Seed job configuration so we can launch seed jobs against PRs""
 > git rev-list --no-walk 58b4d762eece66774a5df6ca54e6f91c49057c9b # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins902256721302274975.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0428111153 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rfzrsgnwt37xg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #690

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/690/display/redirect?page=changes>

Changes:

[noreply] Revert "Merge pull request #17260 from [BEAM-13229] [Website] bug side

[noreply] [BEAM-14001] Add missing test cases to existing suites in exec package

[noreply] [BEAM-14243] Add staticcheck to Github Actions Precommits (#17479)

[noreply] [BEAM-14368][BEAM-13984] Change model loading from constructor to

[noreply] [BEAM-13983] changed file name from sklearn_loader to sklearn_inference

[noreply] Add SQL in Notebooks blog post (#17481)

[noreply] Merge pull request #17404: [BEAM-13990] support date and timestamp


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 58b4d762eece66774a5df6ca54e6f91c49057c9b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 58b4d762eece66774a5df6ca54e6f91c49057c9b # timeout=10
Commit message: "Merge pull request #17404: [BEAM-13990] support date and timestamp fields"
 > git rev-list --no-walk 8c4a056a63d92776ae9d6be726b37d789486afbd # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5886884703813794584.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0428111153 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ze4q7n73kh6lu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #689

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/689/display/redirect?page=changes>

Changes:

[ihr] Update Java katas to Beam 2.38

[Robert Bradshaw] Add element weighting parameter to BatchElements.

[noreply] [BEAM-14369] Fix "target/options: no such file or directory" error while

[noreply] [BEAM-14297] Enable nullable key and value arrays for xlang kafka io

[noreply] Merge pull request #17444 from [BEAM-14310] [Website] bug home

[noreply] Merge pull request #17388 from [BEAM-14311] [Website] Home Page

[noreply] [BEAM-14376] Typo in method description doc

[noreply] Add default classpath when not present (#17491)

[Robert Bradshaw] Clearer test.

[thiagotnunes] fix: update javadocs for ChangeStreamMetrics

[noreply] Merge pull request #17443 from [BEAM-12164]: use the end timestamp for

[noreply] Merge pull request #17260 from [BEAM-13229] [Website] bug side nav

[noreply] [BEAM-14351] Fix the template and move the announcement to the next


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 8c4a056a63d92776ae9d6be726b37d789486afbd (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 8c4a056a63d92776ae9d6be726b37d789486afbd # timeout=10
Commit message: "Merge pull request #17465 Add element weighting parameter to BatchElements."
 > git rev-list --no-walk b0e6b561683425fe865720970ce60d45ecec11e4 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4896086193213808317.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0428111153 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/hdj4e3lwdyiis

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #688

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/688/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #17226 from [BEAM-14204] [Playground] Tests for

[noreply] [BEAM-13015, BEAM-14184] Address unbounded number of messages being

[noreply] Improvement to Seed job configuration to launch against PRs (#17468)

[noreply] [BEAM-13983] Small changes to sklearn runinference (#17459)

[chamikaramj] Renames ExternalPythonTransform to PythonExternalTransform

[noreply] [BEAM-14351] Inherit from Coder. (#17437)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b0e6b561683425fe865720970ce60d45ecec11e4 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b0e6b561683425fe865720970ce60d45ecec11e4 # timeout=10
Commit message: "[BEAM-14351] Inherit from Coder. (#17437)"
 > git rev-list --no-walk bb5342507e77b040f5bb402aa3628a180f7bf71e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7465072747978766080.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0428111153 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 stopped Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/v2aufztan3kae

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #687

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/687/display/redirect?page=changes>

Changes:

[msbukal] FhirIO: use .search() or .searchType instead of .setResourceType()

[nick.caballero] [BEAM-14363] Fixes WatermarkParameters builder for Kinesis

[noreply] Remove unnecessary decorator from RunInference interface (#17463)

[noreply] [BEAM-13590] Minor deprecated warning fix (#17453)

[noreply] [BEAM-12164]: fix the negative throughput issue (#17461)

[noreply] Updated goldens for the screen diff integration tests (#17467)

[noreply] fixes copy by value error for bytes.Buffer in Error (#17469)

[noreply] Merge pull request #17354 from [BEAM-14170] - Create a test that runs

[noreply] Merge pull request #17447 from [BEAM-14357] Fix

[noreply] [BEAM-14324, BEAM-14325] Staticcheck cleanup in test files (#17393)

[noreply] BEAM-14187 Fix NPE (#17454)

[noreply] [BEAM-11105] Stateful watermark estimation (#17374)

[noreply] [BEAM-14304] implement parquetio to read/write parquet files (#17347)

[noreply] [BEAM-11104] Add Checkpointing split to Go SDK (#17386)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision bb5342507e77b040f5bb402aa3628a180f7bf71e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f bb5342507e77b040f5bb402aa3628a180f7bf71e # timeout=10
Commit message: "[BEAM-11104] Add Checkpointing split to Go SDK (#17386)"
 > git rev-list --no-walk 07f30d221e4b285b23b74c3509d77b62388b7bb4 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1414584243313694531.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0427154008 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/nogge7jgz3ljy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #686

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/686/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-14343] Allow expansion service override in ExternalPythonTransform

[Heejong Lee] update

[Heejong Lee] allows remote host

[Heejong Lee] improve compatibility with python rowcoder

[ahmedabualsaud] added tempLocation to test pipeline options

[ahmedabualsaud] using tempRoot for temp bucket location

[ahmedabualsaud] small fixes

[noreply] [BEAM-14320] Update programming-guide w/Java GroupByKey example (#17369)

[noreply] Minor: Fix release script for `current` symlinks (#17457)

[noreply] Minor: fix typo (#17452)

[noreply] Change return type for PytorchInferenceRunner (#17460)

[noreply] [BEAM-13608] JmsIO dynamic topics feature (#17163)

[Heejong Lee] add test


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 07f30d221e4b285b23b74c3509d77b62388b7bb4 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 07f30d221e4b285b23b74c3509d77b62388b7bb4 # timeout=10
Commit message: "Merge pull request #17418 from ihji/BEAM-14343"
 > git rev-list --no-walk 3f2e3c7c9eccb9d40370cbc70e9a451a4b5573f5 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2138889160257108013.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0426150542 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/wdwuqmwavn4a2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #685

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/685/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 3f2e3c7c9eccb9d40370cbc70e9a451a4b5573f5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 3f2e3c7c9eccb9d40370cbc70e9a451a4b5573f5 # timeout=10
Commit message: "[BEAM-13953] added documentation for BQ Storage Write API (#17391)"
 > git rev-list --no-walk 3f2e3c7c9eccb9d40370cbc70e9a451a4b5573f5 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2971211683820978142.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0425150553 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dkbniaeygk6xm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #684

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/684/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13953] added documentation for BQ Storage Write API (#17391)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 3f2e3c7c9eccb9d40370cbc70e9a451a4b5573f5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 3f2e3c7c9eccb9d40370cbc70e9a451a4b5573f5 # timeout=10
Commit message: "[BEAM-13953] added documentation for BQ Storage Write API (#17391)"
 > git rev-list --no-walk 2c18ce0ccd7705473aa9ecc443dcdbe223dd9449 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins3630225432436643427.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0424150518 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/u64xlap3bpvto

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #683

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/683/display/redirect?page=changes>

Changes:

[Andrew Pilloud] [BEAM-14321] SQL passes Null for Null aggregates

[noreply] Create apache-hop-with-dataflow.md

[noreply] Add files via upload

[noreply] Delete website/www/site/content/en/blog/apache-hop-with-dataflow

[noreply] Add files via upload

[noreply] Update apache-hop-with-dataflow.md

[noreply] Update apache-hop-with-dataflow.md

[noreply] Update apache-hop-with-dataflow.md

[danielamartinmtz] Moved up get-credentials instruction for getting the kubeconfig file

[noreply] Merge pull request #17428: [BEAM-14326] Make sure BigQuery daemon thread

[noreply] [BEAM-14301] Add lint:ignore to noescape() func (#17355)

[noreply] [BEAM-14286] Remove unused vars in harness package (#17392)

[noreply] [BEAM-14327] Convert Results to QueryResults directly (#17398)

[noreply] [BEAM-14302] Simplify boolean check in fn.go (#17399)

[noreply] [BEAM-13983] Sklearn Loader for RunInference (#17368)

[noreply] Update authors.yml

[noreply] [BEAM-14358] add retry to connect to testcontainer (#17449)

[noreply] [BEAM-13106] Bump flink docs to 1.14 (#17430)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2c18ce0ccd7705473aa9ecc443dcdbe223dd9449 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2c18ce0ccd7705473aa9ecc443dcdbe223dd9449 # timeout=10
Commit message: "[BEAM-13106] Bump flink docs to 1.14 (#17430)"
 > git rev-list --no-walk 1540b9dccc714d242a51929eac20ced06b1108eb # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4337040279559543750.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0423150511 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dlwr3gij6xngk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #682

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/682/display/redirect?page=changes>

Changes:

[mmack] [BEAM-14345] Force paranamer 2.8 for Spark Hadoop version tests to avoid

[kamil.bregula] Revert "[BEAM-14300] Fix Java precommit failure"

[kamil.bregula] Revert "Merge pull request #17223 from [BEAM-14215] Improve argument

[noreply] [BEAM-13984] Implement RunInference for PyTorch (#17196)

[noreply] [BEAM-13945] add json type support for java bigquery connector (#17209)

[Andrew Pilloud] [BEAM-14348] Upgrade to ZetaSQL 2022.04.1

[Andrew Pilloud] [BEAM-13735] Enable ZetaSQL tests for Java 17

[noreply] [BEAM-14346] Fix incorrect error case index in ret2() (#17425)

[noreply] [BEAM-14342] Fix wrong default buffer type in fn_runner (#17420)

[noreply] Updates opencensus-api dependency to the latest version - 0.31.0

[noreply] [BEAM-14306] Add unit testing to pane coder (#17370)

[noreply] Updated the dep and golden for screen diff integration tests (#17442)

[noreply] [BEAM-13657] Add python 3.6 update to CHANGES.md (#17435)

[noreply] Merge pull request #17438: [BEAM-8127] The GCP module to declare


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1540b9dccc714d242a51929eac20ced06b1108eb (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1540b9dccc714d242a51929eac20ced06b1108eb # timeout=10
Commit message: "Merge pull request #17434: [BEAM-14348] Upgrade to ZetaSQL 2022.04.1"
 > git rev-list --no-walk 373c1c9cb96d77220494b6dbfb1467704639e700 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1812882945580531709.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0422150524 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/qg64k25g4aymk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #681

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/681/display/redirect?page=changes>

Changes:

[vachan] Annotating Read API tests.

[bulat.safiullin] [BEAM-14247] [Website] add image

[bulat.safiullin] [BEAM-14247] [Website] center image

[mattcasters] BEAM-1857 : CHANGES.md entry for 2.38.0

[mmack] [BEAM-14335] Spotless Spark sources

[noreply] [BEAM-14112] Fixed ReadFromBigQuery with Interactive Beam (#17306)

[noreply] Update .asf.yaml (#17409)

[noreply] [BEAM-14336] Sickbay flight delays test - dataset seems to be missing

[noreply] [BEAM-14338] Update watermark unit tests to use time.Time.Equals()

[noreply] [BEAM-14328] Tweaks to "Differences from pandas" page (#17413)

[Andrew Pilloud] [BEAM-14253] Disable broken test pending Dataflow fix

[yiru] fix: BigQuery Storage Connector trace id population missing bracket

[noreply] [BEAM-14330] Temporarily disable the clusters auto-cleanup (#17400)

[noreply] Update Beam website to release 2.38.0 (#17378)

[noreply] [BEAM-14213] Add API and construction time validation for Batched DoFns

[noreply] Minor: Update release guide regarding archive.apache.org (#17419)

[noreply] [BEAM-14017] beam_PreCommit_CommunityMetrics_Cron test failing (#17396)

[noreply] BEAM-13582 Fixing broken links in the documentation (#17300)

[noreply] [BEAM-13657] Sunset python 3.6 (#17252)

[noreply] Removes unsupported Python 3.6 from the release validation script


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-12 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 373c1c9cb96d77220494b6dbfb1467704639e700 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 373c1c9cb96d77220494b6dbfb1467704639e700 # timeout=10
Commit message: "Removes unsupported Python 3.6 from the release validation script (#17397)"
 > git rev-list --no-walk e4d2050ccbaafb90428ab6c0cc494039f6282dae # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1122035712751130889.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0421153032 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 21s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/cuo2vkswxb2b2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #680

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/680/display/redirect?page=changes>

Changes:

[andyye333] Change func to PTransform

[noreply] Populate actual dataflow job id to bigquery write trace id (#17130)

[relax] mark static thread as a daemon thread

[noreply] [BEAM-13866] Add miscellaneous exec unit tests (#17363)

[mmack] [BEAM-14323] Improve IDE integration of Spark cross version builds


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e4d2050ccbaafb90428ab6c0cc494039f6282dae (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e4d2050ccbaafb90428ab6c0cc494039f6282dae # timeout=10
Commit message: "Merge pull request #17389: [BEAM-14323] Improve IDE integration of Spark cross version builds"
 > git rev-list --no-walk 4b709d5456b105ffcc251da7a0a4a0b560491b1c # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7197999154613109409.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0420150519 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/v4kgbtrzo5nuo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #679

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/679/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-14251] add output_coder_override to ExpansionRequest

[Heejong Lee] remove null

[rarokni] [BEAM-14307] Fix Slow Side input pattern bug in sample

[Heejong Lee] better error msg

[Heejong Lee] update from comments

[noreply] [BEAM-14316] Introducing KafkaIO.Read implementation compatibility

[noreply] [BEAM-14290] Address staticcheck warnings in the reflectx package

[noreply] [BEAM-14302] Simply bools in fn.go, genx_test.go (#17356)

[noreply] Merge pull request #17382: [BEAM-12356] Close DatasetService leak as


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4b709d5456b105ffcc251da7a0a4a0b560491b1c (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4b709d5456b105ffcc251da7a0a4a0b560491b1c # timeout=10
Commit message: "Merge pull request #17382: [BEAM-12356] Close DatasetService leak as local variables"
 > git rev-list --no-walk cff9ccd86b390d8e5edfaa850fcf132da178330e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1816320938704738309.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0419150559 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/nowldasd6iea6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #678

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/678/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision cff9ccd86b390d8e5edfaa850fcf132da178330e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f cff9ccd86b390d8e5edfaa850fcf132da178330e # timeout=10
Commit message: "[BEAM-13204] Fix website bug where code tabs do not appear if the default language is not available (#17379)"
 > git rev-list --no-walk cff9ccd86b390d8e5edfaa850fcf132da178330e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2054455528840450042.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0418150524 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/mgaulp2r5ens4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #677

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/677/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision cff9ccd86b390d8e5edfaa850fcf132da178330e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f cff9ccd86b390d8e5edfaa850fcf132da178330e # timeout=10
Commit message: "[BEAM-13204] Fix website bug where code tabs do not appear if the default language is not available (#17379)"
 > git rev-list --no-walk cff9ccd86b390d8e5edfaa850fcf132da178330e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5278740893319933529.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0417150514 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/cfutlnurqnmmk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #676

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/676/display/redirect?page=changes>

Changes:

[pandiana] BigQueryServicesImpl: reduce number of threads spawned by

[noreply] [BEAM-13204] Fix website bug where code tabs do not appear if the


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision cff9ccd86b390d8e5edfaa850fcf132da178330e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f cff9ccd86b390d8e5edfaa850fcf132da178330e # timeout=10
Commit message: "[BEAM-13204] Fix website bug where code tabs do not appear if the default language is not available (#17379)"
 > git rev-list --no-walk ddd95c53738133fbb314cf9ba0ddd457774cfe28 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1915856738666970651.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0416150525 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/j4ljxrfqpnooy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #675

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/675/display/redirect?page=changes>

Changes:

[Kenneth Knowles] Upgrade to Gradle 7.4

[Kenneth Knowles] Remove Python module dependency on Dataflow worker

[noreply] [BEAM-11104] Pipe Continuation to DataSource level (#17334)

[noreply] [BEAM-11105] Basic Watermark Estimation (Wall Clock Observing) (#17267)

[noreply] Respect output coder for TextIO. (#17367)

[noreply] Merge pull request #17200 from [BEAM-12164]: fix the autoscaling backlog

[noreply] [BEAM-17035] Call python3 directly when it is available. (#17366)

[noreply] Merge pull request #17375: [BEAM-8691] Declare newer


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision ddd95c53738133fbb314cf9ba0ddd457774cfe28 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f ddd95c53738133fbb314cf9ba0ddd457774cfe28 # timeout=10
Commit message: "Merge pull request #17375: [BEAM-8691] Declare newer google-cloud-bigtable explicitly"
 > git rev-list --no-walk df6efe3644d08bda747d9d4434ab9e033073c8de # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6460769183051466021.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0415150512 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/o6c7eq3vamzfe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #674

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/674/display/redirect?page=changes>

Changes:

[relax] handle changing schemas in Storage API sink

[noreply] Fix a couple style issues (#17361)

[noreply] [BEAM-14287] Clean up staticcheck warnings in graph/coder (#17337)

[noreply] Improvements to dataflow job service for non-Python jobs. (#17338)

[noreply] Bump minimist (#17290)

[noreply] Bump ansi-regex (#17291)

[noreply] Bump nanoid (#17292)

[noreply] Bump lodash (#17293)

[noreply] Bump url-parse (#17294)

[noreply] Bump moment (#17328)

[noreply] Merge pull request #15549 from [BEAM-11997] Changed RedisIO

[noreply] [BEAM-13925] Dont double assign committers if author or other reviewer

[noreply] [BEAM-13739] Remove deprecated shallow clone funcs (#17362)


------------------------------------------
[...truncated 55.89 KB...]
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.7.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.8.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2729074 sha256=a0c96d986a30e55684d11de5af879c2c4c7c93bfcb4c6c028f0f0e1610e609f5
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.8
    Uninstalling pyparsing-3.0.8:
      Successfully uninstalled pyparsing-3.0.8
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.40 botocore-1.24.40 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.1 google-cloud-bigtable-1.7.1 google-cloud-core-1.7.2 google-cloud-datastore-1.15.4 google-cloud-dlp-3.6.2 google-cloud-language-1.3.1 google-cloud-pubsub-2.12.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.2 google-cloud-videointelligence-1.16.2 google-cloud-vision-1.0.1 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.6 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.35 tenacity-5.1.5 testcontainers-3.5.3 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649951446.409368/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649951446.409368/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649951446.409368/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649951446.409368/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649951446.409368/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649951446.409368/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649951446.409368/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649951446.409368/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220414155046410312-5521'
 createTime: '2022-04-14T15:50:52.458355Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-14_08_50_52-15213106085886930386'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0414150605'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-14T15:50:52.458355Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-14_08_50_52-15213106085886930386]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-14_08_50_52-15213106085886930386
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-14_08_50_52-15213106085886930386?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-14_08_50_52-15213106085886930386 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:57.151Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.138Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.167Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.223Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.256Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.284Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.336Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.420Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.456Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.485Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.517Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.556Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.589Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.622Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.657Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.692Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.798Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.827Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.857Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.891Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.921Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.977Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:59.014Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:59.046Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:51:20.765Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:51:38.412Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:51:59.578Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-14_08_50_52-15213106085886930386 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: df1d8c646fd643498988c61d602c9bb1 and timestamp: 1649952139.076525:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 104
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649952143.584022/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649952143.584022/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649952143.584022/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649952143.584022/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649952143.584022/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649952143.584022/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649952143.584022/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649952143.584022/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220414160223584946-4315'
 createTime: '2022-04-14T16:02:30.336067Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-14_09_02_29-17762620975207896110'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0414150605'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-14T16:02:30.336067Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-14_09_02_29-17762620975207896110]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-14_09_02_29-17762620975207896110
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-14_09_02_29-17762620975207896110?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-14_09_02_29-17762620975207896110 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:35.344Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.381Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.405Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.473Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.533Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.562Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.613Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.674Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.723Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.751Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.817Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.844Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.864Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.913Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.943Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.979Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.004Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.027Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.053Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.086Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.121Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.158Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.195Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.228Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.252Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.279Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.314Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.370Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.409Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.429Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:51.224Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
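
The metric-descriptor warning above is harmless for this test run, but the cleanup it recommends can be scripted. A rough sketch using the google-cloud-monitoring v3 client (the project name and the filter prefix are assumptions; review the listed descriptors before deleting anything):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = "projects/apache-beam-testing"  # assumed project

    # List custom metric descriptors (the kind Dataflow creates for user counters)
    # and delete the stale ones after reviewing them.
    descriptors = client.list_metric_descriptors(
        request={
            "name": project_name,
            "filter": 'metric.type = starts_with("custom.googleapis.com/")',
        })
    for descriptor in descriptors:
        client.delete_metric_descriptor(name=descriptor.name)
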
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:03:19.581Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:03:43.517Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-14_09_02_29-17762620975207896110 after 604 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_128607c9-fe5a-4b6c-a047-be8442e7fb95_read_matcher.
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-14_08_50_52-15213106085886930386?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-14_09_02_29-17762620975207896110?project=apache-beam-testing
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 1018, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_128607c9-fe5a-4b6c-a047-be8442e7fb95_read'
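
The TypeError above comes from passing the subscription path positionally to SubscriberClient.delete_subscription; in google-cloud-pubsub 2.x that positional argument is treated as the whole DeleteSubscriptionRequest, hence the constructor error. A minimal sketch of a 2.x-compatible cleanup call (the names sub_client and read_sub_name mirror the test code, the path below is a hypothetical example, and this is an illustration rather than the committed fix):

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    # Hypothetical subscription path; the real test builds it from a namespace prefix and a UUID.
    read_sub_name = "projects/apache-beam-testing/subscriptions/pubsub_io_performance_example_read"

    # With google-cloud-pubsub >= 2.0 the path goes in as a keyword argument
    # (or inside a request mapping); a bare positional string raises the TypeError above.
    sub_client.delete_subscription(subscription=read_sub_name)

Note that the failed cleanup also masks the earlier assertion failure: the read pipeline received 0 of the expected messages before the matcher timed out.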

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 39m 40s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/vvxue47quoppw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #673

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/673/display/redirect?page=changes>

Changes:

[Valentyn Tymofieiev] Add remaining Dataflow test suites for Python 3.9.

[Heejong Lee] [BEAM-14232] Only resolve artifacts in expanded environments for Java

[noreply] Fix test ordering issue (#17350)

[buqian] Do not pass null to MoreObjects.firstNonNull as default value

[ningkang0957] [BEAM-14288] Fixed flaky test

[noreply] [BEAM-14277] Disables Spanner change streams tests (#17346)

[noreply] [BEAM-14219] Run cleanup script to remove stale prebuilt SDK container

[Heejong Lee] [BEAM-14300] Fix Java precommit failure

[noreply] [BEAM-14116] Rollback "Chunk commit requests dynamically (#17004)"

[noreply] [BEAM-13982] A base class for run inference (#16970)

[ningkang0957] Enumerates all possible expected strings when asserting

[noreply] [BEAM-13966] Add pivot(), a non-deferred column operation on categorical


------------------------------------------
[...truncated 55.07 KB...]
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.8.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2728896 sha256=dfc0027726295999ba4c9a3cf1fc942d51fd05f0c088dddb214ceee856a81eed
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.8
    Uninstalling pyparsing-3.0.8:
      Successfully uninstalled pyparsing-3.0.8
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.39 botocore-1.24.39 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.1 google-cloud-bigtable-1.7.1 google-cloud-core-1.7.2 google-cloud-datastore-1.15.4 google-cloud-dlp-3.6.2 google-cloud-language-1.3.1 google-cloud-pubsub-2.12.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.2 google-cloud-videointelligence-1.16.2 google-cloud-vision-1.0.1 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.6 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.35 tenacity-5.1.5 testcontainers-3.5.3 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865045.142366/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865045.142366/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865045.142366/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865045.142366/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865045.142366/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865045.142366/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865045.142366/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865045.142366/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220413155045143512-1472'
 createTime: '2022-04-13T15:50:51.895646Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-13_08_50_51-2750334581216619093'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0413150514'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-13T15:50:51.895646Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-13_08_50_51-2750334581216619093]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-13_08_50_51-2750334581216619093
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-13_08_50_51-2750334581216619093?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-13_08_50_51-2750334581216619093 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:55.560Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.039Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.092Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.157Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.185Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.211Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.232Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.257Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.283Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.309Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.354Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.380Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.401Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.433Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.452Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.476Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.558Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.577Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.602Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.623Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.644Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.702Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.729Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.754Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:51:27.402Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:51:41.071Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:52:03.261Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:01:02.294Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:01:02.349Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:01:02.384Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:01:02.422Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:01:02.455Z: JOB_MESSAGE_BASIC: Stopping **** pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-13_08_50_51-2750334581216619093 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: fb514593573b4726b04aa514c6470ba2 and timestamp: 1649865728.5488813:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 189
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865733.629359/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865733.629359/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865733.629359/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865733.629359/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865733.629359/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865733.629359/dataflow-****.jar in 7 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865733.629359/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865733.629359/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220413160213630288-4533'
 createTime: '2022-04-13T16:02:22.919929Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-13_09_02_21-534198698279891282'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0413150514'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-13T16:02:22.919929Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-13_09_02_21-534198698279891282]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-13_09_02_21-534198698279891282
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-13_09_02_21-534198698279891282?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-13_09_02_21-534198698279891282 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:30.922Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:36.668Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:41.688Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:41.847Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.072Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.119Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.174Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.234Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.262Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.288Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.353Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.387Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.440Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.495Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.550Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.590Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.618Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.673Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.707Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.738Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.772Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.806Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.895Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.919Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.951Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.985Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:43.018Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:43.077Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:43.105Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:43.152Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:59.225Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:03:17.118Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:03:17.145Z: JOB_MESSAGE_DETAILED: Resized **** pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:03:27.489Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:03:52.257Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-13_09_02_21-534198698279891282 after 605 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 0b154c4d4f7140a28ac6bc7e8f92f740 and timestamp: 1649866539.4664998:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 142
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 0b154c4d4f7140a28ac6bc7e8f92f740 and timestamp: 1649866539.4664998:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 142
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-13_08_50_51-2750334581216619093?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-13_09_02_21-534198698279891282?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 1018, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_4e39f9c6-0ca3-4d99-bb55-ef5e1ceba75a_read'
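
This is the same cleanup failure shown earlier in this digest, which suggests it is not environment-specific. For reference, an equivalent 2.x-compatible form (again only a sketch, with the same hypothetical names as the earlier snippet) wraps the path in a request mapping instead of using the subscription keyword:

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    # Hypothetical path mirroring the one rejected in the traceback above.
    read_sub_name = "projects/apache-beam-testing/subscriptions/pubsub_io_performance_example_read"

    # Request-mapping form accepted by google-cloud-pubsub >= 2.0.
    sub_client.delete_subscription(request={"subscription": read_sub_name})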

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 26m 15s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rnm5ohj7u7qrk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #672

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/672/display/redirect?page=changes>

Changes:

[kamil.bregula] [BEAM-14215] Improve argument validation in SnowflakeIO

[benjamin.gonzalez] [BEAM-14013] Add PreCommit Kotlin examples Jenkins Job

[Andrew Pilloud] [BEAM-13151] Support multiple layers of AutoValue nesting

[Heejong Lee] [BEAM-14233] Merge requirements from expanded response for Java External

[benjamin.gonzalez] [BEAM-14013] Add spark, direct, flink runners as triggers for Kotlin

[noreply] [BEAM-13898] Add tests to the pubsubx package. (#17324)

[noreply] [BEAM-14285] Clean up Staticcheck Warnings in io packages (#17336)

[noreply] [BEAM-14187] Fix concurrency issue in IsmReaderImpl (#17201)

[noreply] [BEAM-14288] Skip flaking test

[noreply] Simplify specifying additional dependencies in Go SDK in XLang IOs

[noreply] [BEAM-14240] Clean staticcheck warnings in runner packages (#17340)

[Daniel Oliveira] [BEAM-13538] Workaround to fix go-licenses crash.


------------------------------------------
[...truncated 55.11 KB...]
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.8.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2726953 sha256=185aece9c53fd7ac8c3aa5b1fea602c78bdad50a05199c45edc46d31f264cf10
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.8
    Uninstalling pyparsing-3.0.8:
      Successfully uninstalled pyparsing-3.0.8
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.38 botocore-1.24.38 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.1 google-cloud-bigtable-1.7.1 google-cloud-core-1.7.2 google-cloud-datastore-1.15.4 google-cloud-dlp-3.6.2 google-cloud-language-1.3.1 google-cloud-pubsub-2.12.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.2 google-cloud-videointelligence-1.16.2 google-cloud-vision-1.0.1 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.6 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.35 tenacity-5.1.5 testcontainers-3.5.3 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649778645.423643/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649778645.423643/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649778645.423643/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649778645.423643/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649778645.423643/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649778645.423643/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649778645.423643/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649778645.423643/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220412155045424579-7183'
 createTime: '2022-04-12T15:50:51.433400Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-12_08_50_51-8594613633872340345'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0412150517'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-12T15:50:51.433400Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-12_08_50_51-8594613633872340345]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-12_08_50_51-8594613633872340345
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-12_08_50_51-8594613633872340345?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-12_08_50_51-8594613633872340345 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:55.447Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.324Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.355Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.425Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.494Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.521Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.552Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.586Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.621Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.650Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.678Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.739Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.767Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.790Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.848Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.878Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:57.005Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:57.040Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:57.074Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:57.099Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:57.129Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:57.186Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:57.214Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:57.254Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:51:16.584Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:51:37.210Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:52:03.175Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-12_08_50_51-8594613633872340345 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: a121ca77f114458e82a2bcaf1975a8c0 and timestamp: 1649779348.0970185:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 184
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649779353.300132/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649779353.300132/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649779353.300132/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649779353.300132/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649779353.300132/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649779353.300132/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649779353.300132/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649779353.300132/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220412160233301081-2366'
 createTime: '2022-04-12T16:02:40.177974Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-12_09_02_39-9535075498141607387'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0412150517'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-12T16:02:40.177974Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-12_09_02_39-9535075498141607387]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-12_09_02_39-9535075498141607387
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-12_09_02_39-9535075498141607387?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-12_09_02_39-9535075498141607387 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:44.728Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:46.495Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:46.528Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:46.593Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:46.676Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:46.704Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:46.769Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:46.848Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:46.889Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:46.930Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:46.962Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:46.994Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.030Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.052Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.084Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.119Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.152Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.186Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.259Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.293Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.325Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.383Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.419Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.449Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.477Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.504Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.535Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.589Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.613Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.662Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:57.408Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:03:28.185Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:03:55.397Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-12_09_02_39-9535075498141607387 after 602 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: f15da87fb515482684796ad5aab99ec5 and timestamp: 1649780101.4088836:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 140
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: f15da87fb515482684796ad5aab99ec5 and timestamp: 1649780101.4088836:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 140
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 1018, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_f3a9d77c-c0d9-43de-98cf-201544891df0_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-12_08_50_51-8594613633872340345?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-12_09_02_39-9535075498141607387?project=apache-beam-testing
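
For anyone triaging this failure: the TypeError above comes from passing the subscription path positionally to SubscriberClient.delete_subscription(), which google-cloud-pubsub 2.x (2.12.0 is installed in this run) interprets as a request object. The following minimal sketch shows the call shape that avoids the error; it is illustrative only, and delete_read_subscription / sub_path are made-up names, not part of the Beam test code:

    # Minimal sketch, assuming google-cloud-pubsub >= 2.0 as installed in this build.
    # Not the actual Beam fix; names here are illustrative.
    from google.cloud import pubsub_v1

    def delete_read_subscription(sub_path: str) -> None:
        """Delete a Pub/Sub subscription given its full resource path."""
        client = pubsub_v1.SubscriberClient()
        # Passing sub_path positionally is treated as the request object and raises
        # TypeError("Invalid constructor input for DeleteSubscriptionRequest").
        client.delete_subscription(subscription=sub_path)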

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 25m 35s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/bhwkz3sa73kgi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #671

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/671/display/redirect>

Changes:


------------------------------------------
[...truncated 55.04 KB...]
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.8.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2726931 sha256=e4550d2ac440bcf3501694ebbd1f58ae8abc76b1f921f51c412b12bddf33cb40
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.8
    Uninstalling pyparsing-3.0.8:
      Successfully uninstalled pyparsing-3.0.8
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.37 botocore-1.24.37 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.1 google-cloud-bigtable-1.7.1 google-cloud-core-1.7.2 google-cloud-datastore-1.15.4 google-cloud-dlp-3.6.2 google-cloud-language-1.3.1 google-cloud-pubsub-2.12.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.2 google-cloud-videointelligence-1.16.2 google-cloud-vision-1.0.1 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.35 tenacity-5.1.5 testcontainers-3.5.3 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692239.456186/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692239.456186/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692239.456186/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692239.456186/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692239.456186/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692239.456186/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692239.456186/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692239.456186/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220411155039457170-4956'
 createTime: '2022-04-11T15:50:45.622571Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-11_08_50_45-3037064181937316376'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0411150533'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-11T15:50:45.622571Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-11_08_50_45-3037064181937316376]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-11_08_50_45-3037064181937316376
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-11_08_50_45-3037064181937316376?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-11_08_50_45-3037064181937316376 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:50:57.526Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.303Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.333Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.404Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.444Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.477Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.506Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.530Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.562Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.598Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.630Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.666Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.699Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.726Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.757Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.789Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.880Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.911Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.958Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.990Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:01.016Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:01.078Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:01.114Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:01.179Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:32.754Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:32.787Z: JOB_MESSAGE_DETAILED: Resized **** pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:35.908Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:43.188Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:52:06.136Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-11_08_50_45-3037064181937316376 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 8a48432880d2456c94042c3f37ad87f8 and timestamp: 1649692962.4810255:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 241
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692966.624122/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692966.624122/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692966.624122/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692966.624122/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692966.624122/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692966.624122/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692966.624122/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692966.624122/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220411160246625118-7269'
 createTime: '2022-04-11T16:02:52.629564Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-11_09_02_52-1350288650734643885'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0411150533'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-11T16:02:52.629564Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-11_09_02_52-1350288650734643885]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-11_09_02_52-1350288650734643885
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-11_09_02_52-1350288650734643885?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-11_09_02_52-1350288650734643885 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:02:58.432Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.174Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.221Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.305Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.375Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.403Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.468Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.534Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.585Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.619Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.660Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.692Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.724Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.754Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.787Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.841Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.895Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.933Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.967Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:01Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:01.032Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:01.066Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:01.103Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:01.143Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:01.171Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:01.203Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:01.235Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:01.290Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:01.318Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:01.388Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:09.104Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:46.207Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:04:13.930Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-11_09_02_52-1350288650734643885 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 1997222e97c54425ae29e2a9aa719b0b and timestamp: 1649693760.147334:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 291
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 1997222e97c54425ae29e2a9aa719b0b and timestamp: 1649693760.147334:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 291
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-11_08_50_45-3037064181937316376?project=apache-beam-testing
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 1018, in delete_subscription
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-11_09_02_52-1350288650734643885?project=apache-beam-testing
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_27430dfb-c522-4e8c-b409-ddad1549061e_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 26m 39s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/l4hvctg4lcq6c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #670

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/670/display/redirect?page=changes>

Changes:

[chamikaramj] Re-raise exceptions swallowed in several Python I/O connectors

[noreply] Merge pull request #16928: [BEAM-11971] Re add reverted timer


------------------------------------------
[...truncated 55.51 KB...]
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.8.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2726931 sha256=410e4deeb053292f19fd7f62d9eb961c68144c2fcbd290f8ea58f6589ace5c83
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.8
    Uninstalling pyparsing-3.0.8:
      Successfully uninstalled pyparsing-3.0.8
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.37 botocore-1.24.37 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.1 google-cloud-bigtable-1.7.1 google-cloud-core-1.7.2 google-cloud-datastore-1.15.4 google-cloud-dlp-3.6.2 google-cloud-language-1.3.1 google-cloud-pubsub-2.12.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.2 google-cloud-videointelligence-1.16.2 google-cloud-vision-1.0.1 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.35 tenacity-5.1.5 testcontainers-3.5.3 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649605846.960539/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649605846.960539/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649605846.960539/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649605846.960539/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649605846.960539/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649605846.960539/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649605846.960539/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649605846.960539/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220410155046961528-7352'
 createTime: '2022-04-10T15:50:53.396748Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-10_08_50_52-2743639274288009278'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0410150552'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-10T15:50:53.396748Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-10_08_50_52-2743639274288009278]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-10_08_50_52-2743639274288009278
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-10_08_50_52-2743639274288009278?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-10_08_50_52-2743639274288009278 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:50:59.352Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:02.620Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:02.652Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:02.720Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:02.754Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:02.786Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:02.831Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:02.865Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:02.907Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:02.934Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:02.963Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:02.997Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.023Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.048Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.076Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.108Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.224Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.253Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.288Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.316Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.348Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.407Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.444Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.473Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:11.693Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:35.569Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:35.596Z: JOB_MESSAGE_DETAILED: Resized **** pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:45.899Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:52:07.294Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:59:16.298Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:59:16.370Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:59:16.394Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:59:16.426Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:59:16.464Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:00:28.932Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:00:28.996Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:00:29.030Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-10_08_50_52-2743639274288009278 is in state JOB_STATE_DONE
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: fb4eabb67dc54e02b23ee2dcc5f4dc1e and timestamp: 1649606439.0959966:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 73
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649606443.271144/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649606443.271144/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649606443.271144/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649606443.271144/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649606443.271144/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649606443.271144/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649606443.271144/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649606443.271144/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
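
The two "Discarding unparseable args" warnings above are generally benign here: the --pubsub_namespace_prefix flag is consumed by the test harness itself rather than by a PipelineOptions subclass, so the options parser simply passes over it. For context, a hedged sketch of how such a flag is conventionally declared so it parses cleanly; the class name and help text below are illustrative, not the test's actual code:

    # Hedged sketch, assuming the standard Beam Python options API.
    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):  # hypothetical class name
        @classmethod
        def _add_argparse_args(cls, parser):
            # Registering the flag makes it a parsed option instead of an
            # "unparseable" leftover argument.
            parser.add_argument(
                '--pubsub_namespace_prefix',
                default=None,
                help='Prefix for temporary Pub/Sub topics and subscriptions.')

    options = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    prefix = options.view_as(PubsubPerfOptions).pubsub_namespace_prefix
    print(prefix)  # -> pubsub_io_performance_
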
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220410160043272047-7201'
 createTime: '2022-04-10T16:00:49.823325Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-10_09_00_49-9968687670329101209'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0410150552'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-10T16:00:49.823325Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-10_09_00_49-9968687670329101209]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-10_09_00_49-9968687670329101209
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-10_09_00_49-9968687670329101209?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-10_09_00_49-9968687670329101209 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:01Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:06.883Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:06.937Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.013Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.093Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.123Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.188Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.255Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.294Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.331Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.363Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.394Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.427Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.460Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.505Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.538Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.570Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.603Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.637Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.669Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.703Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.753Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.795Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.825Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.885Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.922Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.955Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:08.012Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:08.043Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:08.092Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:20.624Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:38.520Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:38.548Z: JOB_MESSAGE_DETAILED: Resized **** pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:48.879Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:02:12.988Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-10_09_00_49-9968687670329101209 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: c5499bc635894d8da4716e2c8da10c57 and timestamp: 1649607289.397949:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 371
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: c5499bc635894d8da4716e2c8da10c57 and timestamp: 1649607289.397949:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 371
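
The "Timing out on waiting for job ... after 603 seconds" warning a few lines above is what the Dataflow runner logs when a result is awaited with a bounded duration and the job has not reached a terminal state in time. A hedged sketch of that call shape, not the load test's actual code; all option values are placeholders:

    # Hedged sketch: bounding the wait on a Dataflow job. wait_until_finish
    # takes a duration in milliseconds and returns control even if the job is
    # still running, at which point the runner emits the timeout warning.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-project',                 # placeholder
        '--region=us-central1',
        '--temp_location=gs://my-bucket/tmp',   # placeholder
        '--streaming',
    ])
    p = beam.Pipeline(options=options)
    _ = p | beam.Create([b'message'])
    result = p.run()
    result.wait_until_finish(duration=600 * 1000)  # ~600 s, like the test's bound
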
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-10_08_50_52-2743639274288009278?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-10_09_00_49-9968687670329101209?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 1018, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_1e8e0ee6-f248-470b-9211-a4401e668858_read'
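
The TypeError above comes from cleanup() passing the subscription path positionally to delete_subscription(); with the google-cloud-pubsub 2.x client installed in this run (2.12.0 per the pip log), a positional argument must be a DeleteSubscriptionRequest or a dict, so the proto-plus constructor rejects a bare string. A minimal hedged sketch of the call shape the 2.x API accepts; the client and path below are placeholders, not the project's actual fix:

    # Hedged sketch: google-cloud-pubsub >= 2.0 expects the subscription path
    # as a keyword argument (or wrapped in a request object), not positionally.
    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    read_sub_name = 'projects/apache-beam-testing/subscriptions/example_read'  # placeholder path
    sub_client.delete_subscription(subscription=read_sub_name)
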

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 25m 25s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/xpodygs2cqhcm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #669

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/669/display/redirect?page=changes>

Changes:

[benjamin.gonzalez] [BEAM-11714] Change spotBugs jenkins config

[Robert Bradshaw] Cleanup docs on Shared.

[Kyle Weaver] Nit: correct description for precommit cron jobs.

[benjamin.gonzalez] [BEAM-11714] Add dummy class for testing

[benjamin.gonzalez] [BEAM-11714] Remove dummy class used for testing

[benjamin.gonzalez] [BEAM-11714] Spotbugs print toJenkins UI precommit_Java17

[noreply] [BEAM-13767] Remove eclipse plugin as it generates a lot of unused tasks

[noreply] [BEAM-10708] Updated beam_sql error message (#17314)

[noreply] [BEAM-14281] add as_deterministic_coder to nullable coder (#17322)

[noreply] Improvements to Beam/Spark quickstart. (#17129)

[chamikaramj] Disable BigQueryIOStorageWriteIT for Runner v2 test suite


------------------------------------------
[...truncated 54.50 KB...]
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.8.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2726702 sha256=cf92c79df76bd217d7325b77dd1a7abe4bc3b1d043954d5bbd09d97eccb7b639
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.37 botocore-1.24.37 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.1 google-cloud-bigtable-1.7.1 google-cloud-core-1.7.2 google-cloud-datastore-1.15.4 google-cloud-dlp-3.6.2 google-cloud-language-1.3.1 google-cloud-pubsub-2.12.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.2 google-cloud-videointelligence-1.16.2 google-cloud-vision-1.0.1 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.35 tenacity-5.1.5 testcontainers-3.5.3 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649519434.275398/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649519434.275398/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649519434.275398/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649519434.275398/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649519434.275398/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649519434.275398/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649519434.275398/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649519434.275398/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220409155034276473-4613'
 createTime: '2022-04-09T15:50:40.630870Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-09_08_50_40-7816878964191486596'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0409150512'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-09T15:50:40.630870Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-09_08_50_40-7816878964191486596]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-09_08_50_40-7816878964191486596
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-09_08_50_40-7816878964191486596?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-09_08_50_40-7816878964191486596 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:44.357Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.126Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.156Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.223Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.256Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.286Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.315Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.349Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.389Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.428Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.497Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.523Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.555Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.579Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.608Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.643Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.759Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.798Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.828Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.863Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.896Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.954Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.978Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:46.008Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:51:20.829Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:51:31.153Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:51:59.634Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-09_08_50_40-7816878964191486596 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: c3e164cdace44842aa0deadd34dbcfe6 and timestamp: 1649520218.2549164:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 97
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649520222.016633/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649520222.016633/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649520222.016633/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649520222.016633/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649520222.016633/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649520222.016633/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649520222.016633/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649520222.016633/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220409160342017577-5352'
 createTime: '2022-04-09T16:03:48.094291Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-09_09_03_47-1709887723878028'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0409150512'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-09T16:03:48.094291Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-09_09_03_47-1709887723878028]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-09_09_03_47-1709887723878028
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-09_09_03_47-1709887723878028?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-09_09_03_47-1709887723878028 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:53.319Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.385Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.425Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.489Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.542Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.570Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.637Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.704Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.735Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.763Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.784Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.806Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.841Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.873Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.910Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.931Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.952Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.984Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.015Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.049Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.072Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.116Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.147Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.180Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.211Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.240Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.263Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.343Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.377Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.421Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:04:23.579Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:04:32.755Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:04:32.783Z: JOB_MESSAGE_DETAILED: Resized **** pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:04:43.054Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:05:03.386Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-09_09_03_47-1709887723878028 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: c2056643112841679d19c24be368e29e and timestamp: 1649521110.2676654:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 405
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: c2056643112841679d19c24be368e29e and timestamp: 1649521110.2676654:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 405
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-09_08_50_40-7816878964191486596?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-09_09_03_47-1709887723878028?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 1018, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_c22b07a0-fc3d-40f7-b6aa-56476efdd263_read'
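
This build (#668) fails on the same constructor error as the runs above. Under the same google-cloud-pubsub 2.x assumption, the equivalent request-object form looks roughly like this; the identifiers are placeholders, not the test's real values:

    # Hedged sketch: building the request object explicitly and passing it via
    # request=, the other form accepted by google-cloud-pubsub 2.x.
    from google.cloud import pubsub_v1
    from google.pubsub_v1 import DeleteSubscriptionRequest

    sub_client = pubsub_v1.SubscriberClient()
    request = DeleteSubscriptionRequest(
        subscription='projects/apache-beam-testing/subscriptions/example_read')  # placeholder path
    sub_client.delete_subscription(request=request)
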

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 29m 5s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/53vjzbdlcaqno

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #668

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/668/display/redirect?page=changes>

Changes:

[johnjcasey] [BEAM-10529] add java and generic components of nullable xlang tests

[johnjcasey] [BEAM-10529] fix test case

[johnjcasey] [BEAM-10529] add coders and typehints to support nullable xlang coders

[johnjcasey] [BEAM-10529] update external builder to support nullable coder

[johnjcasey] [BEAM-10529] clean up coders.py

[johnjcasey] [BEAM-10529] add coder translation test

[johnjcasey] [BEAM-10529] add additional check to typecoder to not accidentally

[johnjcasey] [BEAM-10529] add test to retrieve nullable coder from typehint

[johnjcasey] [BEAM-10529] run spotless

[johnjcasey] [BEAM-10529] add go nullable coder

[johnjcasey] [BEAM-10529] cleanup extra println

[johnjcasey] [BEAM-10529] improve comments, clean up python

[bulat.safiullin] [BEAM-13992] [Website] update Contribute/Code Contribution Guide page

[bulat.safiullin] [BEAM-13992] [Website] change text, transfer tag a

[bulat.safiullin] [BEAM-13992] [Website] change code tags

[bulat.safiullin] [BEAM-13992] [Website] change text

[bulat.safiullin] [BEAM-13992] [Website] change text and links, add empty lines

[bulat.safiullin] [BEAM-13991] [Website] change links, add contribute file

[bulat.safiullin] [BEAM-13991] [Website] add content, add styles

[bulat.safiullin] [BEAM-13991] [Website] add images, add styles, delete spaces

[bulat.safiullin] [BEAM-13991] [Website] change url and aliases, delete bullet points

[bulat.safiullin] [BEAM-13991] [Website] add empty line

[bulat.safiullin] [BEAM-13992] [Website] change links, change text

[bulat.safiullin] [BEAM-13992] [Website] change links, add text, add dots

[bulat.safiullin] [BEAM-13992] [Website] change links, change text

[bulat.safiullin] [BEAM-13991] [Website] change styles, change quotes

[bulat.safiullin] [BEAM-13991] [Website] change link color

[bulat.safiullin] [BEAM-13992] [Website] change text, delete whitespace

[bulat.safiullin] [BEAM-13991] [Website] change text

[bulat.safiullin] [BEAM-13992] [Website] update text

[bulat.safiullin] [BEAM-13991] [Website] added changes from PR 13992, changed get-starting

[shivrajw] [BEAM-14236] Parquet IO support for list to conform with Apache Parquet

[chamikaramj] Sets 'sdk_harness_container_images' property for all Dataflow jobs -

[chamikaramj] Sets 'sdk_harness_container_images' property for all Dataflow jobs -

[mmack] [BEAM-14104] Support shard aware aggregation in Kinesis writer.

[noreply] [BEAM-11745] Fix author list rendering (#17308)

[noreply] [BEAM-14144] Record JFR profiles when GC thrashing is detected (#17151)

[noreply] Factors enable_prime flag in when checking use_unified_worker conditions

[noreply] [BEAM-11104] Add ProcessContinuation type to Go SDK (#17265)

[noreply] BEAM-13939: Restructure Protos to fix namespace conflicts (#16961)

[noreply] [BEAM-14270] Mark {Snowflake/BigQuery}Services as @Internal (#17309)

[noreply] [BEAM-13901] Add unit tests for graphx/cogbk.go

[noreply] [BEAM-14259, BEAM-14266] Remove unused function, replace use of ptypes

[noreply] [BEAM-14274] Fix staticcheck warnings in pipelinex (#17311)

[noreply] [BEAM-13857] Switched Go IT script to using Go flags for expansion

[noreply] Update python beam-master container image. (#17313)


------------------------------------------
[...truncated 54.61 KB...]
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.8.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2726519 sha256=ccae4ff4c259122dd69cc421faa78fbaa098424171047aeccd03519080710d73
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.36 botocore-1.24.36 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.1 google-cloud-bigtable-1.7.1 google-cloud-core-1.7.2 google-cloud-datastore-1.15.4 google-cloud-dlp-3.6.2 google-cloud-language-1.3.1 google-cloud-pubsub-2.12.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.2 google-cloud-videointelligence-1.16.2 google-cloud-vision-1.0.1 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.35 tenacity-5.1.5 testcontainers-3.5.3 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433030.530072/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433030.530072/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433030.530072/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433030.530072/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433030.530072/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433030.530072/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433030.530072/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433030.530072/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
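For reference, this warning is emitted for command-line flags that no registered PipelineOptions subclass declares; such flags are simply carried through as unparsed arguments. A minimal sketch, assuming the Python SDK's standard PipelineOptions mechanism, of how a flag like --pubsub_namespace_prefix could be declared so the parser accepts it (the class name here is hypothetical):

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):  # hypothetical options class, for illustration only
        @classmethod
        def _add_argparse_args(cls, parser):
            # A declared flag is parsed normally instead of being reported
            # under "Discarding unparseable args".
            parser.add_argument(
                '--pubsub_namespace_prefix',
                default=None,
                help='Prefix used for temporary Pub/Sub topics and subscriptions.')

    opts = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    print(opts.view_as(PubsubPerfOptions).pubsub_namespace_prefix)
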
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220408155030531019-1817'
 createTime: '2022-04-08T15:50:37.351243Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-08_08_50_36-9069213566820303836'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0408150541'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-08T15:50:37.351243Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-08_08_50_36-9069213566820303836]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-08_08_50_36-9069213566820303836
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-08_08_50_36-9069213566820303836?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-08_08_50_36-9069213566820303836 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:42.826Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:44.760Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:44.794Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:44.845Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:44.876Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:44.905Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:44.944Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:44.971Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.021Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.052Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.117Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.153Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.191Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.249Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.297Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.336Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.443Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.480Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.506Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.553Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.586Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.636Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.660Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.692Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:53.228Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:51:30.224Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:51:51.645Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:59:59.226Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:59:59.306Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:59:59.334Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:59:59.364Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:59:59.398Z: JOB_MESSAGE_BASIC: Stopping worker pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-08_08_50_36-9069213566820303836 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: ac1e2554b8aa429090da43338d976e25 and timestamp: 1649433663.517995:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 81
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433668.041929/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433668.041929/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433668.041929/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433668.041929/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433668.041929/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433668.041929/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433668.041929/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433668.041929/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220408160108042854-7262'
 createTime: '2022-04-08T16:01:15.888663Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-08_09_01_14-10743278955627327941'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0408150541'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-08T16:01:15.888663Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-08_09_01_14-10743278955627327941]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-08_09_01_14-10743278955627327941
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-08_09_01_14-10743278955627327941?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-08_09_01_14-10743278955627327941 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:26.902Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:32.494Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:32.541Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:32.605Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:32.685Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:32.729Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:32.793Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:32.867Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:32.916Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:32.954Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:32.986Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.023Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.044Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.077Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.112Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.136Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.181Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.213Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.244Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.318Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.347Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.379Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.405Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.434Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.455Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.488Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.519Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.573Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.604Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.655Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:49.915Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:02:20.454Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:02:42.919Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-08_09_01_14-10743278955627327941 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 4ada92d1ad434f32b40e2acb4b56f6f9 and timestamp: 1649434380.8059208:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 232
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 4ada92d1ad434f32b40e2acb4b56f6f9 and timestamp: 1649434380.8059208:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 232
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 1018, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_194dffaf-02f7-436e-8da4-e5104d6b0ee3_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-08_08_50_36-9069213566820303836?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-08_09_01_14-10743278955627327941?project=apache-beam-testing
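The TypeError that aborts the cleanup step is raised by the 2.x google-cloud-pubsub client (google-cloud-pubsub-2.12.0 is installed above): delete_subscription treats its first positional argument as a request object, so a bare subscription path string cannot be coerced into a DeleteSubscriptionRequest. A minimal sketch, assuming google-cloud-pubsub>=2.0, of the call shapes the client does accept; sub_path is a placeholder:

    from google.cloud import pubsub_v1

    client = pubsub_v1.SubscriberClient()
    sub_path = 'projects/apache-beam-testing/subscriptions/example_read'  # placeholder path

    # Positional form reproduces the failure above: the string is treated as a
    # request object and DeleteSubscriptionRequest rejects it.
    #   client.delete_subscription(sub_path)

    # Accepted: pass the path as the 'subscription' field...
    client.delete_subscription(subscription=sub_path)
    # ...or wrap it in an explicit request mapping:
    #   client.delete_subscription(request={'subscription': sub_path})
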

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 23m 37s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/qthdgep6cb3ey

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #667

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/667/display/redirect?page=changes>

Changes:

[noreply] Avoid pr-bot state desync (#17299)

[noreply] [BEAM-14259] Clean up staticcheck warnings in the exec package (#17285)

[noreply] Minor: Prefer registered schema in SQL docs (#17298)

[Kyle Weaver] [BEAM-14262] Update plugins for Dockerized Jenkins.

[Kyle Weaver] Add ansicolor and ws-cleanup plugins.

[noreply] [Playground] add meta tags (#17207)

[noreply] fixes golint and deprecated issues in recent Go SDK import (#17304)

[noreply] [BEAM-14266] Replace deprecated ptypes package uses (#17302)

[noreply] [BEAM-11936] Fix rawtypes warnings in SnowflakeIO (#17257)

[noreply] Merge pull request #17262: [BEAM-14244] Use the supplied output

[noreply] [BEAM-13015] Lookup the container for the step once when registering

[noreply] [BEAM-14175] Log read loop abort at debug rather than error (#17183)


------------------------------------------
[...truncated 51.71 KB...]
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2712736 sha256=48052be426da032e4b2a5f1810a85e470a2b510ec0a7c973fc36ecd2e03847ce
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.35 botocore-1.24.35 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.1 google-cloud-bigtable-1.7.1 google-cloud-core-1.7.2 google-cloud-datastore-1.15.4 google-cloud-dlp-3.6.2 google-cloud-language-1.3.1 google-cloud-pubsub-2.12.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.2 google-cloud-vision-1.0.1 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.35 tenacity-5.1.5 testcontainers-3.5.3 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:java:harness:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:harness:classes
> Task :sdks:java:harness:jar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:worker:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:worker:classes
> Task :runners:google-cloud-dataflow-java:worker:shadowJar

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649346816.509315/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649346816.509315/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649346816.509315/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649346816.509315/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649346816.509315/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649346816.509315/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649346816.509315/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649346816.509315/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220407155336510257-9757'
 createTime: '2022-04-07T15:53:43.410193Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-07_08_53_42-5342302405782967931'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0407150604'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-07T15:53:43.410193Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-07_08_53_42-5342302405782967931]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-07_08_53_42-5342302405782967931
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-07_08_53_42-5342302405782967931?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-07_08_53_42-5342302405782967931 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:48.394Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:53.539Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:53.572Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:53.634Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:53.661Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:53.706Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:53.740Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:53.788Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:53.828Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:53.928Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:53.954Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:53.979Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.004Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.025Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.054Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.089Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.182Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.212Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.238Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.258Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.291Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.341Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.374Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.914Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:54:26.684Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:54:34.696Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:55:00.505Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-07_08_53_42-5342302405782967931 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: b8bdc2e1cd7c4924bbb76c5981b1e2f1 and timestamp: 1649347535.2959418:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 209
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649347541.525844/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649347541.525844/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649347541.525844/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649347541.525844/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649347541.525844/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649347541.525844/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649347541.525844/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649347541.525844/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220407160541526749-7201'
 createTime: '2022-04-07T16:05:47.608693Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-07_09_05_47-10043726317574425415'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0407150604'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-07T16:05:47.608693Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-07_09_05_47-10043726317574425415]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-07_09_05_47-10043726317574425415
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-07_09_05_47-10043726317574425415?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-07_09_05_47-10043726317574425415 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:54.565Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.263Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.292Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.374Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.462Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.490Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.558Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.626Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.666Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.694Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.727Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.763Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.795Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.826Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.860Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.898Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.921Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.958Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.990Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:57.055Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:57.099Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:57.202Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:57.270Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:57.316Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:57.357Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:57.389Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:57.415Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:57.460Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:57.488Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:57.532Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:06:22.354Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:06:42.107Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:07:09.049Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-07_09_05_47-10043726317574425415 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 6b3d0a69224843ccabe1a27781c825e1 and timestamp: 1649348251.9933283:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 93
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 6b3d0a69224843ccabe1a27781c825e1 and timestamp: 1649348251.9933283:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 93
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 1018, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_f15f21f7-81f8-4443-876a-905a088f3495_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-07_08_53_42-5342302405782967931?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-07_09_05_47-10043726317574425415?project=apache-beam-testing
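The same cleanup failure recurs here. A minimal sketch, assuming google-cloud-pubsub>=2.0, of how the DeleteSubscriptionRequest type behaves when constructed directly, which is the step that raises in the traceback above (sub_path is a placeholder):

    from google.pubsub_v1 import DeleteSubscriptionRequest

    sub_path = 'projects/apache-beam-testing/subscriptions/example_read'  # placeholder path

    # Both of these build the request the client constructs internally:
    DeleteSubscriptionRequest(subscription=sub_path)
    DeleteSubscriptionRequest({'subscription': sub_path})

    # A bare string is neither a mapping nor a message, which yields the
    # "Invalid constructor input" TypeError seen above:
    try:
        DeleteSubscriptionRequest(sub_path)
    except TypeError as err:
        print(err)
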

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 28m 6s
92 actionable tasks: 60 executed, 30 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/nflg3u6isig6s

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #666

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/666/display/redirect?page=changes>

Changes:

[bingyeli] update query

[Robert Bradshaw] [BEAM-14250] Fix coder registration for types defined in __main__.

[johnjcasey] [BEAM-14256] update SpEL dependency to 5.3.18.RELEASE

[johnjcasey] [BEAM-14256] remove .RELEASE

[dannymccormick] Fix dependency issue causing failures

[Kyle Weaver] [BEAM-9649] Add region option to Mongo Dataflow test.

[noreply] Allow get_coder(None).

[noreply] [BEAM-13015] Disable retries for fnapi grpc channels which otherwise

[noreply] [BEAM-13952] Sickbay

[noreply] BEAM-14235 parquetio module does not parse PEP-440 compliant Pyarrow

[noreply] [Website] Contribution guide page indent bug fix (#17287)

[noreply] [BEAM-10976] Document go sdk bundle finalization (#17048)

[noreply] [BEAM-13829] Expose status API from Go SDK Harness (#16957)


------------------------------------------
[...truncated 55.05 KB...]
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.8.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: sqlalchemy, apache-beam
  Building wheel for sqlalchemy (setup.py): started
  Building wheel for sqlalchemy (setup.py): finished with status 'done'
  Created wheel for sqlalchemy: filename=SQLAlchemy-1.4.35-cp37-cp37m-linux_x86_64.whl size=1599143 sha256=1946c268ef57ed5bd1a946663d728aefa88db45ce36c199740c4f67d3d3be92b
  Stored in directory: /home/jenkins/.cache/pip/wheels/47/4b/54/e232479cdb4834a9fab3e9b9b11edb77472957215b129b8406
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2712736 sha256=54ea4f216a97d52a36346486cdde726033d297e750c79b257b0fb516b6e061ee
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built sqlalchemy apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.34 botocore-1.24.34 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.1 google-cloud-bigtable-1.7.1 google-cloud-core-1.7.2 google-cloud-datastore-1.15.4 google-cloud-dlp-3.6.2 google-cloud-language-1.3.1 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.1 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.35 tenacity-5.1.5 testcontainers-3.5.3 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260240.908256/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260240.908256/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260240.908256/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260240.908256/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260240.908256/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260240.908256/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260240.908256/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260240.908256/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220406155040909269-7638'
 createTime: '2022-04-06T15:50:47.270007Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-06_08_50_46-15011121603976568263'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0406150535'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-06T15:50:47.270007Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-06_08_50_46-15011121603976568263]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-06_08_50_46-15011121603976568263
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-06_08_50_46-15011121603976568263?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-06_08_50_46-15011121603976568263 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:58.040Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.479Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.524Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.586Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.625Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.655Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.697Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.733Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.800Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.849Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.882Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.916Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.950Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.995Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:00.016Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:00.047Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:00.184Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:00.216Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:00.240Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:00.267Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:00.291Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:00.345Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:00.377Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:00.420Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:29.134Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:41.567Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:52:07.104Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-06_08_50_46-15011121603976568263 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: dd42e88ddddf42e28fce497b679faf9a and timestamp: 1649260943.952181:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 103
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260949.135759/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260949.135759/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260949.135759/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260949.135759/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260949.135759/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260949.135759/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260949.135759/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260949.135759/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220406160229136670-9391'
 createTime: '2022-04-06T16:02:35.583053Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-06_09_02_35-16822319082303818898'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0406150535'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-06T16:02:35.583053Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-06_09_02_35-16822319082303818898]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-06_09_02_35-16822319082303818898
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-06_09_02_35-16822319082303818898?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-06_09_02_35-16822319082303818898 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:46.214Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.105Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.157Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.234Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.302Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.334Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.395Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.440Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.481Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.507Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.539Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.571Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.603Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.646Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.678Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.711Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.745Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.785Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.815Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.847Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.886Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.925Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.950Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.981Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:48.009Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:48.030Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:48.064Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:48.117Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:48.152Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:48.183Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:03:07.285Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:03:31.821Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:03:58.075Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-06_09_02_35-16822319082303818898 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 58057af7156441f18992d2275c1c514e and timestamp: 1649261687.0942461:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 134
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_cd87c33f-c69e-41ae-8814-228198d77b98_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-06_08_50_46-15011121603976568263?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-06_09_02_35-16822319082303818898?project=apache-beam-testing

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 25m 20s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/lop2jjqkh327g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #665

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/665/display/redirect?page=changes>

Changes:

[benjamin.gonzalez] [BEAM-8970] Add docs to run wordcount example on portable Spark Runner

[Kiley Sok] Update python container version

[benjamin.gonzalez] [BEAM-8970] Add period to end of sentence

[Kyle Weaver] Add self-descriptive message for expected errors.

[noreply] Add --dataflowServiceOptions=enable_prime to useUnifiedWorker conditions

[noreply] [BEAM-10529] nullable xlang coder (#16923)

[noreply] Fix go fmt break in core/typex/special.go (#17266)

[noreply] [BEAM-5436] Add doc page on Go cross compilation. (#17256)

[noreply] Pr-bot Don't count all reviews as approvals (#17269)

[noreply] Fix postcommits (#17263)

[noreply] [BEAM-14241] Address staticcheck warnings in boot.go (#17264)

[noreply] [BEAM-14157] GrpcWindmillServer: Use stream specific boolean to do

[noreply] [BEAM-10582] Allow (and test) pyarrow 7 (#17229)

[noreply] [BEAM-13519] Solve race issues when the server responds with an error


------------------------------------------
[...truncated 55.76 KB...]
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.8.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2712594 sha256=a564f1648ab35075d545ff88dc43fedd27dc0b04bd2e2a02513bd07230327713
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.33 botocore-1.24.33 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.1 google-cloud-core-1.7.2 google-cloud-datastore-1.15.4 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.1 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.34 tenacity-5.1.5 testcontainers-3.5.3 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649173836.392900/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649173836.392900/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649173836.392900/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649173836.392900/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649173836.392900/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649173836.392900/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649173836.392900/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649173836.392900/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220405155036393877-9762'
 createTime: '2022-04-05T15:50:43.218242Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-05_08_50_42-2460069620461278920'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0405150536'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-05T15:50:43.218242Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-05_08_50_42-2460069620461278920]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-05_08_50_42-2460069620461278920
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-05_08_50_42-2460069620461278920?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-05_08_50_42-2460069620461278920 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:51.879Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:52.670Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:52.716Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:52.786Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:52.817Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:52.853Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:52.885Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:52.910Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:52.948Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:52.977Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.004Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.030Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.059Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.113Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.135Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.167Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.285Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.315Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.345Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.379Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.417Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.493Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.520Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.567Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:51:20.619Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:51:41.407Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:52:08.086Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-05_08_50_42-2460069620461278920 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 7ddc5d9bb77e4419bd538bc081280910 and timestamp: 1649174555.0291286:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 136
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649174559.004543/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649174559.004543/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649174559.004543/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649174559.004543/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649174559.004543/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649174559.004543/dataflow-****.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649174559.004543/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649174559.004543/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220405160239005481-5339'
 createTime: '2022-04-05T16:02:46.650564Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-05_09_02_46-16565799279862875471'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0405150536'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-05T16:02:46.650564Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-05_09_02_46-16565799279862875471]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-05_09_02_46-16565799279862875471
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-05_09_02_46-16565799279862875471?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-05_09_02_46-16565799279862875471 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:52.563Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:56.809Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:56.853Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:56.920Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:56.983Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.011Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.080Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.145Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.176Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.246Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.267Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.288Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.308Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.334Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.368Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.403Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.437Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.470Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.527Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.549Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.573Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.605Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.641Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.672Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.704Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.727Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.760Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.819Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.851Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.882Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:03:21.333Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:03:43.383Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:04:08.920Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-05_09_02_46-16565799279862875471 after 603 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_0bcd7cd6-45e7-4bdf-b57b-fd0f809d765b_read_matcher.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-05_08_50_42-2460069620461278920?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-05_09_02_46-16565799279862875471?project=apache-beam-testing
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py",> line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py",> line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py",> line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_0bcd7cd6-45e7-4bdf-b57b-fd0f809d765b_read'
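The cleanup failure above appears to be the google-cloud-pubsub 1.x-to-2.x API change: in the 2.x client, delete_subscription() no longer accepts the subscription path as a positional string, so the string is handed to the DeleteSubscriptionRequest constructor and rejected by proto-plus. A minimal sketch of the 2.x call shape, assuming a google-cloud-pubsub >= 2.0 environment (the subscription path below is illustrative, not the test's generated name):

    # Hedged sketch of the pubsub_v1 2.x delete call; not the perf test's own code.
    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()
    # Illustrative path; the real test derives a namespaced subscription name.
    subscription_path = "projects/apache-beam-testing/subscriptions/example_read"

    # Pass the path as a keyword argument, or wrap it explicitly in
    # pubsub_v1.types.DeleteSubscriptionRequest(subscription=subscription_path).
    subscriber.delete_subscription(subscription=subscription_path)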

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 40m 4s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/6ftbldjn5q324

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #664

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/664/display/redirect>

Changes:


------------------------------------------
[...truncated 55.00 KB...]
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.8.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2711991 sha256=8d5adc790d971e130221a2964f18e320acab7fbea22efe76f23ca0bb892d3c9d
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.32 botocore-1.24.32 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.34 tenacity-5.1.5 testcontainers-3.5.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649087431.719604/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649087431.719604/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649087431.719604/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649087431.719604/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649087431.719604/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649087431.719604/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649087431.719604/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649087431.719604/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
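The two "Discarding unparseable args" warnings mean that --pubsub_namespace_prefix is not registered with any PipelineOptions parser, so the options machinery logs it and drops it rather than failing. A minimal sketch of how such a custom flag is typically registered (the class name and default below are hypothetical, not the perf test's own code):

    # Hedged sketch: exposing a custom flag through a PipelineOptions subclass.
    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):  # hypothetical name
        @classmethod
        def _add_argparse_args(cls, parser):
            parser.add_argument(
                '--pubsub_namespace_prefix',
                default=None,
                help='Prefix used to namespace the test topic and subscriptions.')

    # Flags registered this way are parsed instead of being discarded.
    options = PubsubPerfOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    prefix = options.view_as(PubsubPerfOptions).pubsub_namespace_prefix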
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220404155031720560-5310'
 createTime: '2022-04-04T15:50:39.612948Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-04_08_50_39-12155189229686329970'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0404150543'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-04T15:50:39.612948Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-04_08_50_39-12155189229686329970]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-04_08_50_39-12155189229686329970
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-04_08_50_39-12155189229686329970?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-04_08_50_39-12155189229686329970 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:43.533Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:44.815Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:44.852Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:44.905Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:44.940Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:44.962Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:44.983Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.004Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.046Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.075Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.103Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.131Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.190Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.225Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.258Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.291Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.416Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.448Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.481Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.514Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.537Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.594Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.631Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.678Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:51:18.936Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:51:19.518Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:51:19.538Z: JOB_MESSAGE_DETAILED: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:51:29.732Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:51:53.386Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-04_08_50_39-12155189229686329970 after 605 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 486c6265e7744882bcb97f1d7151b22c and timestamp: 1649088216.9733398:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 102
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649088220.871698/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649088220.871698/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649088220.871698/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649088220.871698/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649088220.871698/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649088220.871698/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649088220.871698/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649088220.871698/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220404160340872588-2382'
 createTime: '2022-04-04T16:03:47.627415Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-04_09_03_46-10844960694172558554'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0404150543'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-04T16:03:47.627415Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-04_09_03_46-10844960694172558554]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-04_09_03_46-10844960694172558554
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-04_09_03_46-10844960694172558554?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-04_09_03_46-10844960694172558554 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:52.889Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:53.647Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:53.671Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:53.731Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:53.792Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:53.821Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:53.917Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:53.965Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.016Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.045Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.077Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.110Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.133Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.155Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.181Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.213Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.245Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.284Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.319Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.350Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.384Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.418Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.453Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.474Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.501Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.538Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.572Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.613Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.635Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.677Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:04:08.710Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:04:25.230Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:04:25.247Z: JOB_MESSAGE_DETAILED: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:04:35.478Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:05:00.003Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-04_09_03_46-10844960694172558554 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: e353fda9025d4b5fb11787dd6f257286 and timestamp: 1649089122.3941875:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 253
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: e353fda9025d4b5fb11787dd6f257286 and timestamp: 1649089122.3941875:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 253
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-04_08_50_39-12155189229686329970?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-04_09_03_46-10844960694172558554?project=apache-beam-testing
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_f5cbd081-9d14-4c4e-bff1-e2dfce2a28fa_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 29m 21s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/5q5rr72fizs2y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #663

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/663/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14143] Simplifies the ExternalPythonTransform API (#17101)


------------------------------------------
[...truncated 54.92 KB...]
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.8.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2711991 sha256=9a3e714c870932dc452b8a1806c1d3f9465c5a864b0015443987e50d9b580559
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.32 botocore-1.24.32 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.34 tenacity-5.1.5 testcontainers-3.5.0 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001028.934009/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001028.934009/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001028.934009/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001028.934009/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001028.934009/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001028.934009/dataflow-worker.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001028.934009/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001028.934009/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220403155028934979-6810'
 createTime: '2022-04-03T15:50:34.590741Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-03_08_50_34-3763690064078693377'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0403150534'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-03T15:50:34.590741Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-03_08_50_34-3763690064078693377]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-03_08_50_34-3763690064078693377
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-03_08_50_34-3763690064078693377?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-03_08_50_34-3763690064078693377 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:41.006Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:41.967Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.021Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.083Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.121Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.149Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.183Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.216Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.247Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.269Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.293Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.326Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.361Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.398Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.429Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.455Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.536Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.568Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.599Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.634Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.657Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.715Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.747Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.792Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:51:14.776Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:51:14.800Z: JOB_MESSAGE_DETAILED: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:51:17.926Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:51:25.018Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:51:48.136Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-03_08_50_34-3763690064078693377 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 697081be162843d9b094e46f89545d09 and timestamp: 1649001811.832278:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 98
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001816.313489/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001816.313489/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001816.313489/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001816.313489/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001816.313489/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001816.313489/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001816.313489/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001816.313489/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220403160336314397-3454'
 createTime: '2022-04-03T16:03:44.142056Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-03_09_03_42-11817159222089951991'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0403150534'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-03T16:03:44.142056Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-03_09_03_42-11817159222089951991]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-03_09_03_42-11817159222089951991
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-03_09_03_42-11817159222089951991?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-03_09_03_42-11817159222089951991 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:49.081Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:49.720Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:49.750Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:49.829Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:49.902Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:49.938Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.003Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.078Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.121Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.146Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.189Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.235Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.270Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.302Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.335Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.367Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.399Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.431Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.462Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.499Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.530Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.564Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.595Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.627Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.662Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.694Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.729Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.785Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.814Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.856Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:04:26.254Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:04:36.363Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:04:59.755Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-03_09_03_42-11817159222089951991 after 602 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: a1dff3675aa84004b439c528cd3e997b and timestamp: 1649002591.7351391:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 119
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: a1dff3675aa84004b439c528cd3e997b and timestamp: 1649002591.7351391:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 119
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_bec1b1ba-bd86-4cb4-93f2-0b2ea5367842_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-03_08_50_34-3763690064078693377?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-03_09_03_42-11817159222089951991?project=apache-beam-testing
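Note on the TypeError above: the cleanup step passes the subscription path as a bare positional string, but the google-cloud-pubsub 2.x generated client expects either a DeleteSubscriptionRequest or the path as the `subscription` keyword argument. A minimal, illustrative sketch of that keyword-style call (not the project's actual fix; the function and variable names here are placeholders):

    # Assumes google-cloud-pubsub >= 2.0 is installed.
    from google.cloud import pubsub_v1

    def delete_read_subscription(subscription_path):
        # subscription_path is a full path such as
        # 'projects/<project>/subscriptions/<subscription>'.
        sub_client = pubsub_v1.SubscriberClient()
        # Passing the path positionally makes proto-plus try to treat the string
        # as a request mapping and raises "Invalid constructor input for
        # DeleteSubscriptionRequest"; the keyword form builds the request correctly.
        sub_client.delete_subscription(subscription=subscription_path)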

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 9s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/7egr6lhpxfbas

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #662

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/662/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-14133] Fix potential NPE in BigQueryServicesImpl.getErrorInfo

[Robert Bradshaw] Revert "Revert "[BEAM-14038] Auto-startup for Python expansion service.

[Robert Bradshaw] Skip failing test for now.

[Kyle Weaver] [BEAM-14225] load balance jenkins jobs

[noreply] [BEAM-14153] Reshuffled Row Coder PCollection used as Side Input cause

[noreply] delint go sdk (#17247)

[Heejong Lee] add test

[noreply] Merge pull request #16841 from [BEAM-8823] Make FnApiRunner work by

[noreply] [BEAM-14192] Update legacy container version (#17210)

[noreply] Fix mishandling of API with BQIO (#17211)

[noreply] [BEAM-14221] Update documentation with Flink on Dataproc features

[Kiley Sok] Revert "[BEAM-14190] Python sends dataflow schema field"


------------------------------------------
[...truncated 55.64 KB...]
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.7.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2711991 sha256=3ebda5ee9610893a97c131316bde15d66d185883ff33de271eca12a2609b369b
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.32 botocore-1.24.32 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.34 tenacity-5.1.5 testcontainers-3.5.0 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648914630.974077/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648914630.974077/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648914630.974077/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648914630.974077/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648914630.974077/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648914630.974077/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648914630.974077/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648914630.974077/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220402155030975011-9943'
 createTime: '2022-04-02T15:50:37.159203Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-02_08_50_36-14063834008723349518'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0402150535'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-02T15:50:37.159203Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-02_08_50_36-14063834008723349518]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-02_08_50_36-14063834008723349518
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-02_08_50_36-14063834008723349518?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-02_08_50_36-14063834008723349518 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:42.726Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:43.831Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:43.863Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:43.957Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.004Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.034Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.058Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.091Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.122Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.151Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.184Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.218Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.244Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.277Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.297Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.324Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.415Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.450Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.483Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.516Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.541Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.658Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.692Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.727Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:59.553Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:51:23.551Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:51:50.332Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-02_08_50_36-14063834008723349518 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 7cc9c8454d374700bfeef19f59cb695d and timestamp: 1648915419.3568988:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 102
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648915424.373994/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648915424.373994/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648915424.373994/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648915424.373994/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648915424.373994/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648915424.373994/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648915424.373994/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648915424.373994/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220402160344374877-3621'
 createTime: '2022-04-02T16:03:50.921308Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-02_09_03_50-8566020059848353751'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0402150535'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-02T16:03:50.921308Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-02_09_03_50-8566020059848353751]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-02_09_03_50-8566020059848353751
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-02_09_03_50-8566020059848353751?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-02_09_03_50-8566020059848353751 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:01.325Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.293Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.324Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.391Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.464Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.484Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.526Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.612Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.643Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.674Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.707Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.739Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.772Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.815Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.852Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.873Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.905Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.938Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.983Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:03.077Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:03.169Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:03.213Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:03.250Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:03.277Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:03.300Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:03.323Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:03.346Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:03.397Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:03.419Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:03.459Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:25.224Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:44.107Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:05:07.976Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-02_09_03_50-8566020059848353751 after 604 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_7d92fc2f-c1dd-416a-966d-48d7e8867ccd_read_matcher.
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-02_08_50_36-14063834008723349518?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-02_09_03_50-8566020059848353751?project=apache-beam-testing
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py",> line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py",> line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py",> line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_7d92fc2f-c1dd-416a-966d-48d7e8867ccd_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 42s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/c3a4gvyrfqeng

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #661

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/661/display/redirect?page=changes>

Changes:

[bulat.safiullin] [BEAM-14164] [Website] change styles

[Andrew Pilloud] [BEAM-14190] Python sends dataflow schema field

[noreply] [BEAM-14179] Fix possibly null value

[noreply] [BEAM-12815] Try to fix flaky Flink Post Commit (#17227)

[noreply] Add a portable job server that proxies the Dataflow backend. (#17189)

[noreply] [BEAM-14130] Implement JupyterLab extension for managing Dataproc

[Andrew Pilloud] [BEAM-13741] Remove forced calcite dependency from BaseBeamTable

[noreply] [BEAM-13951] Update release guide with pointers on updating


------------------------------------------
[...truncated 55.44 KB...]
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.7.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2703619 sha256=27785b7966e64159054ac0b59e7fecbe4f4054d61a53f63b3dfb53d5d50aae85
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.31 botocore-1.24.31 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.34 tenacity-5.1.5 testcontainers-3.5.0 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648828246.601442/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648828246.601442/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648828246.601442/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648828246.601442/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648828246.601442/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648828246.601442/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648828246.601442/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648828246.601442/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220401155046602392-6626'
 createTime: '2022-04-01T15:50:52.719080Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-01_08_50_52-5073730340269013657'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0401150532'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-01T15:50:52.719080Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-01_08_50_52-5073730340269013657]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-01_08_50_52-5073730340269013657
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-01_08_50_52-5073730340269013657?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-01_08_50_52-5073730340269013657 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:50:58.558Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.277Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.379Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.460Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.501Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.527Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.560Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.583Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.661Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.685Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.709Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.743Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.771Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.792Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.815Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.836Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.933Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.986Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:05.022Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:05.047Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:05.079Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:05.130Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:05.152Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:05.184Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:28.339Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:34.821Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:34.856Z: JOB_MESSAGE_DETAILED: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:45.115Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:52:08.877Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-01_08_50_52-5073730340269013657 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: a667e2693702471fa5f954cc815adb06 and timestamp: 1648829024.9020364:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 98
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648829028.626891/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648829028.626891/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648829028.626891/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648829028.626891/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648829028.626891/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648829028.626891/dataflow-worker.jar in 7 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648829028.626891/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648829028.626891/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
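(The "Discarding unparseable args" warning is what PipelineOptions prints for flags it has no registered parser for. A sketch of how a custom flag is normally declared so it parses cleanly follows; the class name and help text are illustrative, not Beam's actual test code.)

    # Illustrative sketch: registering a custom flag so PipelineOptions parses it
    # instead of discarding it. The class name and help text are made up here.
    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfTestOptions(PipelineOptions):
        @classmethod
        def _add_argparse_args(cls, parser):
            parser.add_argument(
                '--pubsub_namespace_prefix',
                default=None,
                help='Prefix for Pub/Sub resources created by the test.')

    opts = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    print(opts.view_as(PubsubPerfTestOptions).pubsub_namespace_prefix)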
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220401160348627794-6554'
 createTime: '2022-04-01T16:03:56.957762Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-01_09_03_56-765668334667913902'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0401150532'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-01T16:03:56.957762Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-01_09_03_56-765668334667913902]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-01_09_03_56-765668334667913902
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-01_09_03_56-765668334667913902?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-01_09_03_56-765668334667913902 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:03.405Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:10.309Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:15.333Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:15.736Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:15.812Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:15.843Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:15.919Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:15.988Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.032Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.072Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.104Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.136Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.167Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.207Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.242Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.274Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.310Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.345Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.378Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.409Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.444Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.476Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.542Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.582Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.614Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.636Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.670Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.735Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.772Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.854Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:22.206Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:05:01.272Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:05:25.999Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-01_09_03_56-765668334667913902 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 9841426c87fa4ed8b4e1b7c7645fe224 and timestamp: 1648829844.7056231:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 88
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 9841426c87fa4ed8b4e1b7c7645fe224 and timestamp: 1648829844.7056231:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 88
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_4453e2f7-2902-4a40-a9b2-6505120e9690_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-01_08_50_52-5073730340269013657?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-01_09_03_56-765668334667913902?project=apache-beam-testing
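(The TypeError above looks like the actual failure: with google-cloud-pubsub 2.x, SubscriberClient.delete_subscription no longer accepts a bare subscription path as its positional argument; it expects a DeleteSubscriptionRequest or keyword arguments. Below is a sketch of the call form the 2.x client accepts; the subscription path is copied from the log, and this is an illustration rather than the committed fix.)

    # Illustrative sketch: deleting a subscription with google-cloud-pubsub >= 2.0.
    # Passing the path positionally (as the test's cleanup does) raises the
    # TypeError above; the 2.x surface wants keyword arguments or a request object.
    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    read_sub_name = (
        "projects/apache-beam-testing/subscriptions/"
        "pubsub_io_performance_4453e2f7-2902-4a40-a9b2-6505120e9690_read")

    sub_client.delete_subscription(subscription=read_sub_name)
    # Equivalent: sub_client.delete_subscription(request={"subscription": read_sub_name})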

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 58s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/fmyg3c77eij6u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #660

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/660/display/redirect?page=changes>

Changes:

[brachipa] [BEAM-14094]Fix null pointer exception in HllCountInitFn

[brachipa] [BEAM-14094]Fix null pointer exception in HllCountInitFn

[noreply] Merge pull request #17149 from [BEAM-13883] [Playground] Increase test

[Kiley Sok] ignore test

[noreply] [BEAM-13948] Add unstack(), a non-deferred column operation on

[noreply] [BEAM-10976] Bundle finalization: E2E support (#17045)


------------------------------------------
[...truncated 55.28 KB...]
  Using cached pluggy-0.13.1-py2.py3-none-any.whl (18 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.7.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2698067 sha256=c2df61a0424fc39591e2a73f77aab5d2fc598250a10c747fc874f59609085688
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.11.0 boto3-1.21.30 botocore-1.24.30 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.33 tenacity-5.1.5 testcontainers-3.5.0 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648741844.821784/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648741844.821784/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648741844.821784/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648741844.821784/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648741844.821784/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648741844.821784/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648741844.821784/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648741844.821784/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220331155044822732-6144'
 createTime: '2022-03-31T15:50:52.326556Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-31_08_50_51-9882255125219877299'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0331150537'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-31T15:50:52.326556Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-31_08_50_51-9882255125219877299]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-31_08_50_51-9882255125219877299
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-31_08_50_51-9882255125219877299?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-31_08_50_51-9882255125219877299 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:50:56.483Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.445Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.520Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.620Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.659Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.685Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.717Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.747Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.788Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.823Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.857Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.890Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.924Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.958Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.989Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:03.020Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:03.130Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:03.162Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:03.189Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:03.222Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:03.254Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:03.313Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:03.341Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:03.381Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:17.875Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:47.412Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:52:11.409Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-31_08_50_51-9882255125219877299 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: d6b729a782fe4977b59584ac1bac9c13 and timestamp: 1648742655.0827138:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 96
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648742659.055022/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648742659.055022/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648742659.055022/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648742659.055022/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648742659.055022/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648742659.055022/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648742659.055022/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648742659.055022/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220331160419055933-9268'
 createTime: '2022-03-31T16:04:26.398123Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-31_09_04_25-4322660594425624147'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0331150537'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-31T16:04:26.398123Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-31_09_04_25-4322660594425624147]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-31_09_04_25-4322660594425624147
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-31_09_04_25-4322660594425624147?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-31_09_04_25-4322660594425624147 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:31.979Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.470Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.497Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.580Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.640Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.668Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.728Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.818Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.858Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.888Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.917Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.938Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.969Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.002Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.057Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.087Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.111Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.135Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.159Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.183Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.215Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.249Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.276Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.296Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.316Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.375Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.397Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.446Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.481Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.504Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:05:00.127Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:05:19.129Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:05:46.115Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-31_09_04_25-4322660594425624147 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 1e1a2131175b4408a6574afeaeec8b62 and timestamp: 1648743616.770134:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 307
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 1e1a2131175b4408a6574afeaeec8b62 and timestamp: 1648743616.770134:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 307
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-31_08_50_51-9882255125219877299?project=apache-beam-testing
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-31_09_04_25-4322660594425624147?project=apache-beam-testing
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_bf134323-af7b-4376-ab96-fa3ad834cbae_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 30m 53s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/w3nixy4bf6twq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #659

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/659/display/redirect?page=changes>

Changes:

[vachan] Update display data to include BQ information.

[noreply] Revert "[BEAM-14084] iterable_input_value_types changed from list to

[egalpin] [BEAM-14003] Adds compat for Elasticsearch 8.0.0

[egalpin] [BEAM-13136] Removes support for Elasticsearch 2.x

[Valentyn Tymofieiev] Ensure the removed option prebuild_sdk_container_base_image not used on

[noreply] Merge pull request #17202 from [BEAM-14194]: Disallow autoscaling for

[noreply] Merge pull request #17080 from [BEAM-13880] [Playground] Increase test

[noreply] Merge pull request #17050 from [BEAM-13877] [Playground] Increase test

[noreply] [BEAM-14200] Improve SamzaJobInvoker extensibility (#17212)

[noreply] Merge pull request #17148 from [BEAM-14042] [playground] Scroll imports

[noreply] [BEAM-13918] Increase datastoreio go sdk unit test coverage (#17173)

[noreply] Merge pull request #16819: [BEAM-13806] Adding test suite for Go x-lang


------------------------------------------
[...truncated 55.40 KB...]
  Using cached more_itertools-8.12.0-py3-none-any.whl (54 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.7.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2696577 sha256=0c0b5adc4a1e1dfac126265ddf023a269ec5c8d9de43f9f351aa7c5902cc0e55
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.11.0 boto3-1.21.29 botocore-1.24.29 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648655435.430726/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648655435.430726/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648655435.430726/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648655435.430726/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648655435.430726/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648655435.430726/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648655435.430726/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648655435.430726/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220330155035431715-7021'
 createTime: '2022-03-30T15:50:43.091910Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-30_08_50_42-14528617402994523475'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0330150617'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-30T15:50:43.091910Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-30_08_50_42-14528617402994523475]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-30_08_50_42-14528617402994523475
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-30_08_50_42-14528617402994523475?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-30_08_50_42-14528617402994523475 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:52.213Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:53.760Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:53.785Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:53.849Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:53.886Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:53.915Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:53.948Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:53.982Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.026Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.070Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.105Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.143Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.196Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.230Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.262Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.296Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.400Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.439Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.467Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.503Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.537Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.597Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.626Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.657Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:51:05.418Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:51:46.751Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:52:10.840Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-30_08_50_42-14528617402994523475 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 06cf7d105b7f4fda98fa7ccc7b5b06d1 and timestamp: 1648656240.7515266:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 98
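
The "project already contains 100 Dataflow-created metric descriptors" notice above points at the Monitoring API for cleaning up unused custom metric descriptors. As an illustration only (not part of the test, and assuming a google-cloud-monitoring 2.x client), the same cleanup can be scripted from Python; the project name is taken from the job messages above, the filter is an assumption, and the delete call is left commented out:

from google.cloud import monitoring_v3

# List user-defined metric descriptors for the project; only
# custom.googleapis.com/* descriptors count against the limit in the log.
client = monitoring_v3.MetricServiceClient()
project_name = "projects/apache-beam-testing"

descriptors = client.list_metric_descriptors(
    request={
        "name": project_name,
        "filter": 'metric.type = starts_with("custom.googleapis.com/")',
    })

for descriptor in descriptors:
    print(descriptor.type)
    # Uncomment to actually free quota for descriptors that are no longer used:
    # client.delete_metric_descriptor(request={"name": descriptor.name})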
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648656245.695693/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648656245.695693/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648656245.695693/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648656245.695693/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648656245.695693/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648656245.695693/dataflow-****.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648656245.695693/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648656245.695693/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220330160405696596-1170'
 createTime: '2022-03-30T16:04:13.427193Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-30_09_04_13-606840261092082655'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0330150617'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-30T16:04:13.427193Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-30_09_04_13-606840261092082655]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-30_09_04_13-606840261092082655
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-30_09_04_13-606840261092082655?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-30_09_04_13-606840261092082655 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:18.935Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.275Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.329Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.415Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.499Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.527Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.591Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.681Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.720Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.754Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.787Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.820Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.853Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.897Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.918Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.938Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.969Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.990Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.024Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.045Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.067Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.099Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
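
The fused stages listed above outline the read-side measurement pipeline (Read from pubsub -> Map -> Measure time -> Window -> Count messages -> Convert to bytes -> Write to Pubsub). For orientation only, a rough sketch of a pipeline with that shape follows; it is not the actual pubsub_io_perf_test.py code, and the transform bodies, subscription and topic names are placeholder assumptions:

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions
from apache_beam.transforms import window

options = PipelineOptions()
options.view_as(StandardOptions).streaming = True

with beam.Pipeline(options=options) as p:
    (p
     | 'Read from pubsub' >> beam.io.ReadFromPubSub(
         subscription='projects/my-project/subscriptions/my-read-sub')
     | 'Map' >> beam.Map(len)  # placeholder for the lambda at pubsub_io_perf_test.py:171
     | 'Measure time' >> beam.Map(lambda x: x)  # placeholder for the timing DoFn
     | 'Window' >> beam.WindowInto(window.FixedWindows(60))
     | 'Count messages' >> beam.CombineGlobally(
         beam.combiners.CountCombineFn()).without_defaults()
     | 'Convert to bytes' >> beam.Map(lambda count: str(count).encode('utf-8'))
     | 'Write to Pubsub' >> beam.io.WriteToPubSub(
         topic='projects/my-project/topics/my-results-topic'))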
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.127Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.156Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.208Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.242Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.274Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.356Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.394Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.445Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:50.276Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:05:08.173Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:05:34.676Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-30_09_04_13-606840261092082655 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 169c243ef62e494a946d94b7ccb7c904 and timestamp: 1648657182.8920436:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 267
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 169c243ef62e494a946d94b7ccb7c904 and timestamp: 1648657182.8920436:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 267
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-30_08_50_42-14528617402994523475?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-30_09_04_13-606840261092082655?project=apache-beam-testing
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_8dce2603-2890-4d74-9c68-7372a0fda6bb_read'
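
The TypeError above is the read-phase cleanup handing a bare subscription path string to SubscriberClient.delete_subscription, which the 2.x google-cloud-pubsub client (the install logs further down in this digest show google-cloud-pubsub-2.11.0) no longer accepts positionally: the string is taken as the request argument and rejected by the proto-plus constructor. A minimal sketch of the call shapes the 2.x client does accept, with a placeholder subscription path:

from google.cloud import pubsub_v1

sub_client = pubsub_v1.SubscriberClient()
sub_path = "projects/my-project/subscriptions/my-subscription"  # placeholder

# What the test does; the bare string becomes the `request` argument and
# proto-plus raises the TypeError seen above:
# sub_client.delete_subscription(sub_path)

# Accepted: pass the resource name as a keyword argument ...
sub_client.delete_subscription(subscription=sub_path)

# ... or as an explicit request mapping:
# sub_client.delete_subscription(request={"subscription": sub_path})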

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 30m 20s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/wbdzybce2jtws

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #658

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/658/display/redirect?page=changes>

Changes:

[hengfeng] feat: remove the metadata table after the pipeline finishes

[thiagotnunes] test: add test for metadata table dropping

[noreply] [BEAM-14177] Fix GBK re-iteration caching for portable runners. (#17184)

[noreply] Merge pull request #17187: [BEAM-14181] Make sure to evict connections

[noreply] Only reset transform.label if it is correctly assigned (#17192)

[noreply] [BEAM-12641] Use google-auth instead of oauth2client for GCP auth

[Robert Bradshaw] [BEAM-14163] Fix typo in single core per container logic.

[chamikaramj] Convert URLs to local jars when constructing filesToStage

[thiagotnunes] test: disable SpannerIO.readChangeStream test

[noreply] Merge pull request #17164 from [BEAM-14140][Playground] Fix Deploy

[noreply] Merge pull request #16855 from [BEAM-13938][Playground] Increase test

[noreply] [BEAM-13314]Revise recommendations to manage Python pipeline


------------------------------------------
[...truncated 54.41 KB...]
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)

> Task :runners:google-cloud-dataflow-java:****:compileJava

> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.7.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2696331 sha256=7dbaec09403d781bb2038730197786cf3b7429f103fa94f0b16b807761a74aba
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.11.0 boto3-1.21.28 botocore-1.24.28 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :runners:google-cloud-dataflow-java:****:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:****:classes
> Task :runners:google-cloud-dataflow-java:****:shadowJar

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569127.950120/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569127.950120/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569127.950120/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569127.950120/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569127.950120/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569127.950120/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569127.950120/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569127.950120/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220329155207951052-6507'
 createTime: '2022-03-29T15:52:14.581687Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-29_08_52_14-11276422842038304298'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0329150534'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-29T15:52:14.581687Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-29_08_52_14-11276422842038304298]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-29_08_52_14-11276422842038304298
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-29_08_52_14-11276422842038304298?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-29_08_52_14-11276422842038304298 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:20.689Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.039Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.089Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.179Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.241Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.287Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.321Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.361Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.446Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.495Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.539Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.573Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.619Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.651Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.688Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.733Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.877Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.932Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.974Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:24.047Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:24.086Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:24.202Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:24.271Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:24.308Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:54.794Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:53:10.415Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:53:34.202Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-29_08_52_14-11276422842038304298 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 739a2f29538c4739ba0cc214faefdd44 and timestamp: 1648569907.4379876:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 99
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569912.632737/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569912.632737/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569912.632737/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569912.632737/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569912.632737/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569912.632737/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569912.632737/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569912.632737/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220329160512633711-9694'
 createTime: '2022-03-29T16:05:19.050028Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-29_09_05_18-3961140471685971338'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0329150534'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-29T16:05:19.050028Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-29_09_05_18-3961140471685971338]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-29_09_05_18-3961140471685971338
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-29_09_05_18-3961140471685971338?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-29_09_05_18-3961140471685971338 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:24.113Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.010Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.043Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.106Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.204Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.239Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.294Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.347Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.384Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.419Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.442Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.462Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.529Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.566Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.600Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.644Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.674Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.706Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.738Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.792Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.823Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.845Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.882Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.923Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.974Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:26.026Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:26.063Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:26.125Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:26.187Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:26.244Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:36.991Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:06:10.164Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:06:35.744Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-29_09_05_18-3961140471685971338 after 602 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 4eca1ae65ea84f6f8c8edaae99903ff9 and timestamp: 1648570921.8876903:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 351
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 4eca1ae65ea84f6f8c8edaae99903ff9 and timestamp: 1648570921.8876903:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 351
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_e6af68a0-c6e7-483b-8c3c-4d4606db5eaf_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-29_08_52_14-11276422842038304298?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-29_09_05_18-3961140471685971338?project=apache-beam-testing
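
This is the same cleanup failure as in the build above. For reference, a small sketch of the proto-plus constructor behaviour behind the "Invalid constructor input" message, assuming google-cloud-pubsub 2.x: message classes such as DeleteSubscriptionRequest accept another message, a dict-like mapping, or individual fields as keywords, but not a bare resource-name string. The path below is a placeholder.

from google.pubsub_v1 import types

path = "projects/my-project/subscriptions/my-subscription"  # placeholder

ok = types.DeleteSubscriptionRequest(subscription=path)             # fields as keywords
also_ok = types.DeleteSubscriptionRequest({"subscription": path})   # mapping

# types.DeleteSubscriptionRequest(path)  # a plain str raises the TypeError in the log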

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 32m 39s
92 actionable tasks: 59 executed, 31 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/aoqce3afhscwm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #657

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/657/display/redirect?page=changes>

Changes:

[noreply] Minor: Add warning about pubsub client to Beam 2.36.0 blog (#17188)


------------------------------------------
[...truncated 55.66 KB...]
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.7.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2695994 sha256=9a24c2940a225b86a6b44ca9d4bb5cf3f55046881f49947379a1f90e15c0be4f
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.27 botocore-1.24.27 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648482639.626516/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648482639.626516/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648482639.626516/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648482639.626516/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648482639.626516/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648482639.626516/dataflow-****.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648482639.626516/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648482639.626516/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220328155039627465-7907'
 createTime: '2022-03-28T15:50:45.040438Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-28_08_50_44-6445254387771334530'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0328150538'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-28T15:50:45.040438Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-28_08_50_44-6445254387771334530]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-28_08_50_44-6445254387771334530
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-28_08_50_44-6445254387771334530?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-28_08_50_44-6445254387771334530 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:56.886Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:57.660Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:57.712Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:57.770Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:57.828Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:57.865Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:57.901Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:57.932Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:57.973Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.013Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.045Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.070Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.098Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.133Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.166Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.200Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.297Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.330Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.386Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.421Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.455Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.564Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.591Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.620Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:51:30.532Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:51:38.465Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:52:02.872Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:00:16.791Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:00:16.838Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:00:16.875Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:00:16.916Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:00:16.951Z: JOB_MESSAGE_BASIC: Stopping **** pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-28_08_50_44-6445254387771334530 after 601 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: a80a679d3bf244a583d9bf116787aa2f and timestamp: 1648483367.3406136:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 110
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648483371.529303/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648483371.529303/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648483371.529303/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648483371.529303/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648483371.529303/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648483371.529303/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648483371.529303/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648483371.529303/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220328160251530260-3259'
 createTime: '2022-03-28T16:02:57.547576Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-28_09_02_57-2745958637978817641'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0328150538'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-28T16:02:57.547576Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-28_09_02_57-2745958637978817641]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-28_09_02_57-2745958637978817641
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-28_09_02_57-2745958637978817641?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-28_09_02_57-2745958637978817641 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:04.546Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:10.631Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:10.743Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:10.806Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:10.874Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:10.903Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:10.970Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.078Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.116Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.143Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.164Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.197Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.230Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.262Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.296Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.328Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.361Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.393Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.414Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.446Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.489Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.522Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.558Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.595Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.621Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.653Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.687Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.751Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.775Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.804Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:31.882Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:42.275Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:42.308Z: JOB_MESSAGE_DETAILED: Resized **** pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:52.539Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:04:14.937Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-28_09_02_57-2745958637978817641 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 75d0113e14884cc1b7f3f2d836a66644 and timestamp: 1648484182.5431263:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 125
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-28_08_50_44-6445254387771334530?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-28_09_02_57-2745958637978817641?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_17c5e214-3015-45a5-9087-fa0860095e93_read'
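
The TypeError above comes from the cleanup step handing the subscription path positionally to delete_subscription. The later runs quoted in this thread install google-cloud-pubsub 2.11.0, and the generated 2.x client methods take a request object or keyword arguments, so the bare string ends up being fed to the DeleteSubscriptionRequest constructor. The snippet below is only a sketch of the keyword-style call that the 2.x surface accepts; the client construction and the hard-coded subscription path (copied from the error message) are illustrative assumptions, not the test's actual setup code.

# Hedged sketch: deleting a Pub/Sub subscription with the google-cloud-pubsub
# 2.x client. The subscription path is copied from the TypeError above; the
# client creation is illustrative and not taken from pubsub_io_perf_test.py.
from google.cloud import pubsub_v1

sub_client = pubsub_v1.SubscriberClient()
read_sub_name = (
    "projects/apache-beam-testing/subscriptions/"
    "pubsub_io_performance_17c5e214-3015-45a5-9087-fa0860095e93_read")

# Passing the path positionally routes it into the `request` parameter and
# reproduces the constructor error above; the 2.x API instead accepts a
# keyword argument or a request mapping.
sub_client.delete_subscription(subscription=read_sub_name)
# equivalently:
# sub_client.delete_subscription(request={"subscription": read_sub_name})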

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/kyg2nzlxwgygy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #656

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/656/display/redirect>

Changes:


------------------------------------------
[...truncated 55.75 KB...]
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.7.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2695994 sha256=5a6a666952937f3d8e7a3cf289d52c3f9532319f6fd4fbcedf2d9305c100ae81
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.27 botocore-1.24.27 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396241.661936/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396241.661936/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396241.661936/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396241.661936/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396241.661936/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396241.661936/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396241.661936/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396241.661936/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220327155041662864-2441'
 createTime: '2022-03-27T15:50:48.179114Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-27_08_50_47-4711899144640281827'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0327150527'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-27T15:50:48.179114Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-27_08_50_47-4711899144640281827]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-27_08_50_47-4711899144640281827
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-27_08_50_47-4711899144640281827?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-27_08_50_47-4711899144640281827 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:53.738Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:54.660Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:54.697Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:54.785Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:54.881Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:54.921Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:54.957Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:54.993Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.054Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.125Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.277Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.337Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.447Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.497Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.542Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.580Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.745Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.832Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.874Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.909Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.942Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:56.025Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:56.065Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:56.115Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:51:30.322Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:51:30.362Z: JOB_MESSAGE_DETAILED: Resized **** pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:51:32.660Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:51:40.581Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:52:05.790Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:00:15.965Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:00:16.124Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:00:16.171Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:00:16.259Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:00:16.313Z: JOB_MESSAGE_BASIC: Stopping **** pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-27_08_50_47-4711899144640281827 after 602 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: fa6d761871f742dc9e713ab85868701b and timestamp: 1648396974.9372354:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 119
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396981.550581/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396981.550581/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396981.550581/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396981.550581/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396981.550581/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396981.550581/dataflow-****.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396981.550581/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396981.550581/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220327160301551488-3847'
 createTime: '2022-03-27T16:03:09.204265Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-27_09_03_08-18240558474435659941'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0327150527'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-27T16:03:09.204265Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-27_09_03_08-18240558474435659941]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-27_09_03_08-18240558474435659941
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-27_09_03_08-18240558474435659941?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-27_09_03_08-18240558474435659941 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:14.896Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.114Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.149Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.214Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.261Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.290Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.358Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.401Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.430Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.462Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.494Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.528Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.559Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.591Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.639Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.665Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.697Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.720Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.752Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.794Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.827Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.872Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.896Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.942Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.973Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:17.005Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:17.038Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:17.091Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:17.122Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:17.163Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:46.555Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:56.551Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:04:20.791Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-27_09_03_08-18240558474435659941 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 080604e9c65f49e1b06e830927112dae and timestamp: 1648397774.3947783:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 92
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-27_08_50_47-4711899144640281827?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-27_09_03_08-18240558474435659941?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_65637f39-760d-49d9-94e8-bf85955f8f25_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 26m 52s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/5jmplau6i4eve

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #655

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/655/display/redirect?page=changes>

Changes:

[ryanthompson591] iterable_input_value_types will now be an iterable, I don't anticipate

[marco.robles] [BEAM-8218] PulsarIO Connector

[benjamin.gonzalez] [BEAM-12572] Change examples jobs to run as cron jobs

[benjamin.gonzalez] [BEAM-12572] SpotlessApply

[Robert Bradshaw] [BEAM-14171] More explicit asserts in CoGBKResult.

[Robert Bradshaw] Add some comments.

[noreply] [BEAM-14160] Parse filesToStage in Java expansion service (#17167)

[chamikaramj] Mapped JOB_STATE_RESOURCE_CLEANING_UP to RESOURCE_CLEANING_UP in Python

[noreply] Explicitly import estimator from tensorflow (#17168)


------------------------------------------
[...truncated 55.62 KB...]
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.7.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2695994 sha256=095ff4f884c2ad0d5f1869f9147e985924402f8d9517b85756d3f9936cfe7fc9
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.27 botocore-1.24.27 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648309842.291760/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648309842.291760/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648309842.291760/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648309842.291760/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648309842.291760/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648309842.291760/dataflow-****.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648309842.291760/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648309842.291760/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
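The "Discarding unparseable args" warnings above are emitted by apache_beam.options.pipeline_options when a command-line flag is not declared on any PipelineOptions subclass. A minimal sketch of how a custom flag like --pubsub_namespace_prefix could be declared so the parser accepts it; the class name is hypothetical and only the flag name comes from the log above:

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubNamespaceOptions(PipelineOptions):  # hypothetical class, for illustration only
        @classmethod
        def _add_argparse_args(cls, parser):
            # Declaring the flag keeps PipelineOptions from reporting it as unparseable.
            parser.add_argument(
                '--pubsub_namespace_prefix',
                default=None,
                help='Prefix for Pub/Sub resources created by the performance test.')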
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220326155042292667-2331'
 createTime: '2022-03-26T15:50:49.875405Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-26_08_50_49-8532487721309707138'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0326150513'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-26T15:50:49.875405Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-26_08_50_49-8532487721309707138]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-26_08_50_49-8532487721309707138
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-26_08_50_49-8532487721309707138?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-26_08_50_49-8532487721309707138 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:55.069Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.077Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.106Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.173Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.226Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.255Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.277Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.301Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.340Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.392Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.425Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.458Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.492Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.526Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.550Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.575Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.654Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.684Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.713Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.746Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.786Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.834Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.866Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.904Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:51:25.368Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:51:39.529Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:52:05.552Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:00:23.727Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:00:23.791Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:00:23.832Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:00:23.882Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:00:23.919Z: JOB_MESSAGE_BASIC: Stopping **** pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-26_08_50_49-8532487721309707138 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 6464f000845e4070acca5329e1fd7143 and timestamp: 1648310577.4501204:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 108
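The "Timing out on waiting for job ... after 603 seconds" warning above is the usual shape of a streaming performance test: the runner result is only waited on for a bounded duration, after which the metric (pubsub_io_perf_write_runtime here) is collected. A minimal sketch of that pattern against Beam's Python API, with an illustrative timeout; `pipeline` stands for an already-constructed beam.Pipeline:

    # Illustrative sketch: bounded wait on a streaming Dataflow job before reading metrics.
    result = pipeline.run()
    # wait_until_finish takes a duration in milliseconds; a streaming job normally
    # outlives it, which is what produces the "Timing out on waiting" warning above.
    result.wait_until_finish(duration=10 * 60 * 1000)
    metric_results = result.metrics().query()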
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648310581.289617/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648310581.289617/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648310581.289617/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648310581.289617/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648310581.289617/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648310581.289617/dataflow-****.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648310581.289617/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648310581.289617/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220326160301290512-5660'
 createTime: '2022-03-26T16:03:08.513074Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-26_09_03_08-10414373202299082515'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0326150513'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-26T16:03:08.513074Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-26_09_03_08-10414373202299082515]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-26_09_03_08-10414373202299082515
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-26_09_03_08-10414373202299082515?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-26_09_03_08-10414373202299082515 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:13.739Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:14.453Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:14.492Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:14.568Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:14.637Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:14.672Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:14.779Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:14.864Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:14.918Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:14.951Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:14.982Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.023Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.059Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.089Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.123Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.155Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.186Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.217Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.251Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.284Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.314Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.347Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.403Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.449Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.478Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.512Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.545Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.615Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.651Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.680Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:32.455Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:56.785Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:04:21.619Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-26_09_03_08-10414373202299082515 after 601 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 9d68a4266ec94d059beda3ee85f4f608 and timestamp: 1648311393.4949863:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 99
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-26_08_50_49-8532487721309707138?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-26_09_03_08-10414373202299082515?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_403a40d6-3d73-4c90-b464-ed934d7d2bb3_read'
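The TypeError above comes from passing a bare subscription path string positionally to delete_subscription: in google-cloud-pubsub 2.x (google-cloud-pubsub-2.11.0 is installed above), the first positional parameter is a request object, so a plain string fails inside DeleteSubscriptionRequest's constructor. A minimal sketch of a cleanup call written against that client API; this illustrates the signature only and is not the project's actual fix:

    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()
    read_sub_name = ('projects/apache-beam-testing/subscriptions/'
                     'pubsub_io_performance_403a40d6-3d73-4c90-b464-ed934d7d2bb3_read')

    # Pass the subscription path as a keyword argument (or wrap it in a request dict);
    # passing it positionally is what raises the TypeError shown above.
    subscriber.delete_subscription(subscription=read_sub_name)
    # Equivalent request-style call:
    # subscriber.delete_subscription(request={"subscription": read_sub_name})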

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 11s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/bgzbatlsr7t76

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #654

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/654/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-14139] Drop support for Flink 1.11.

[Kyle Weaver] [BEAM-14139] Remove obsolete reference to Flink 1.11.

[Kyle Weaver] [BEAM-14139] Update list of supported Flink versions.

[Kyle Weaver] [BEAM-14139] Update CHANGES.md

[noreply] [BEAM-14157] Don't call requestObserver.onNext on a closed windmill

[noreply] Minor: Make IOTypeHints a real NamedTuple (#17174)

[noreply] [BEAM-14172] Update tox.ini for pydocs (#17176)

[noreply] [BEAM-14065] Upgrade vendored bytebuddy to version 1.12.8 (#17028)


------------------------------------------
[...truncated 55.61 KB...]
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.7.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2695940 sha256=ee580f7c9136b480b72044fc5692c7a31faf2f852a52a5d1a3efcb5524b140dc
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.26 botocore-1.24.26 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648223442.385099/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648223442.385099/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648223442.385099/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648223442.385099/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648223442.385099/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648223442.385099/dataflow-****.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648223442.385099/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648223442.385099/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220325155042386065-4565'
 createTime: '2022-03-25T15:50:51.112843Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-25_08_50_49-14711236213746615654'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0325150536'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-25T15:50:51.112843Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-25_08_50_49-14711236213746615654]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-25_08_50_49-14711236213746615654
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-25_08_50_49-14711236213746615654?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-25_08_50_49-14711236213746615654 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:58.135Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:58.908Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:58.951Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.054Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.119Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.153Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.189Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.221Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.261Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.342Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.436Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.494Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.576Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.619Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.651Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.683Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.769Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.811Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.846Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.893Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.951Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:51:00.014Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:51:00.035Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:51:02.093Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:51:13.520Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:51:42.249Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:52:07.608Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:00:16.909Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:00:17.222Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:00:17.278Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:00:17.330Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:00:17.373Z: JOB_MESSAGE_BASIC: Stopping **** pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-25_08_50_49-14711236213746615654 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: ce215b7f561645c6be63ed72ab4f525a and timestamp: 1648224162.8593292:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 117
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648224167.234807/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648224167.234807/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648224167.234807/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648224167.234807/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648224167.234807/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648224167.234807/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648224167.234807/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648224167.234807/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220325160247235831-9950'
 createTime: '2022-03-25T16:02:54.267613Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-25_09_02_53-5305199909241581353'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0325150536'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-25T16:02:54.267613Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-25_09_02_53-5305199909241581353]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-25_09_02_53-5305199909241581353
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-25_09_02_53-5305199909241581353?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-25_09_02_53-5305199909241581353 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:04.048Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.113Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.140Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.229Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.292Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.320Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.395Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.450Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.491Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.515Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.536Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.570Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.603Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.636Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.667Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.690Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.719Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.743Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.782Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.813Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.847Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.879Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.906Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.936Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.967Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.990Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:08.011Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:08.056Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:08.094Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:08.116Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:34.138Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:37.130Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:37.263Z: JOB_MESSAGE_DETAILED: Resized **** pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:47.510Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:04:10.048Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-25_09_02_53-5305199909241581353 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 37b1e39389aa48138b07c3dfe5914a6e and timestamp: 1648224953.9698627:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 101
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-25_08_50_49-14711236213746615654?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-25_09_02_53-5305199909241581353?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_fdeede52-ff6f-4b0c-9e9c-9fcddfbd3150_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 26m 33s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ym5nv7kpz3q2o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #653

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/653/display/redirect?page=changes>

Changes:

[bulat.safiullin] [BEAM-13976] [Website] update homepage

[bulat.safiullin] [BEAM-13976] [Website] update homepage, add logo

[bulat.safiullin] [BEAM-13976] [Website] update text

[bulat.safiullin] [BEAM-13976] [Website] Update Community landing page

[bulat.safiullin] [BEAM-13979] [Website] Update Community/Contact us page

[bulat.safiullin] [BEAM-13979] [Website] update title

[bulat.safiullin] [BEAM-13979] [Website] delete space

[bulat.safiullin] [BEAM-13979] [Website] add Beam Playground

[bulat.safiullin] [BEAM-13976] [Website] delete Beam Playground

[bulat.safiullin] [BEAM-13976] [Website] change navbar css links rules, delete links from

[bulat.safiullin] [BEAM-13977] [Website] delete available-contact-channels on mobile

[bulat.safiullin] [BEAM-13976] [Website] change padding size between the sections

[bulat.safiullin] [BEAM-13976] [Website] change title to capital letters

[bulat.safiullin] [BEAM-13976] [Website] change title

[bulat.safiullin] [BEAM-14040] [Website] create new page, add link

[bulat.safiullin] [BEAM-13977] [Website] change title

[bulat.safiullin] [BEAM-13979] [Website] change text

[bulat.safiullin] [BEAM-13976] [Website] change text

[bulat.safiullin] [BEAM-13977] [Website] change text, add capital letters

[bulat.safiullin] [BEAM-13976] [Website] add playground sass, change text-align

[bulat.safiullin] [BEAM-14040] [Website] add io connectors table

[bulat.safiullin] [BEAM-13976] [Website] add playground section, add empty line

[bulat.safiullin] [BEAM-14040] [Website] add overflow to css, add table content

[bulat.safiullin] [BEAM-14040] [Website] change ✘ for ✔, add license, add br

[bulat.safiullin] [BEAM-14040] [Website] add empty line

[bulat.safiullin] [BEAM-14040] [Website] change td

[bulat.safiullin] [BEAM-14041] [Website] update built io transforms

[bulat.safiullin] [BEAM-14041] [Website] move connectors from Miscellaneous to Database

[bulat.safiullin] [BEAM-14040] [Website] change links color

[danielamartinmtz] Updated metrics' CronJob API to use the latest batch version.

[bulat.safiullin] [BEAM-14041] [Website] change IO from go to java

[bulat.safiullin] [BEAM-14040] [Website] change links, change specific version to current

[danielamartinmtz] Updated cluster to test in metrics-upgrade-clone in BeamMetrics_Publish

[aydar.zaynutdinov] [BEAM-13976][Website]

[aydar.zaynutdinov] [BEAM-14040][Website]

[aydar.zaynutdinov] [BEAM-14041][Website]

[danielamartinmtz] Updated StateFulSet k8s obejct in cassandra-svc-statefulset.yaml file in

[danielamartinmtz] Updated documentation including cluster specs.

[noreply] Beam 13058 k8s apis upgrade - elasticsearch (#18)

[danielamartinmtz] Removed code used for testing.

[danielamartinmtz] Removed code used for testing in job_PostCommit_BeamMetrics_Publish

[noreply] Beam 13058 k8s apis upgrade - Adding Basic Auth details in documentation

[Pablo Estrada] [BEAM-14151] Excluding Spanner CDC tests from Dataflow V1 suite

[danielamartinmtz] Added comments in initContainers and remove unused code in elasticsearch

[noreply] [BEAM-12697] Add primitive field generation from IR to SBE extension

[noreply] [BEAM-13889] Add test cases to jsonx package (#17124)

[noreply] Remove unreachable code in container.go (#17166)

[noreply] Add ability to handle streaming input to AvroSchemaIOProvider (#17126)

[noreply] [BEAM-12898] Flink Load Tests failure- UncheckedExecutionException -

[Daniel Oliveira] Moving to 2.39.0-SNAPSHOT on master branch.


------------------------------------------
[...truncated 55.48 KB...]
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.7.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2695955 sha256=7b835f32ce796b07e7c1f7dacb678bfad0c4315420300002f7937db0ce42335a
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.25 botocore-1.24.25 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.45.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137076.483940/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137076.483940/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137076.483940/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137076.483940/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137076.483940/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137076.483940/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137076.483940/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137076.483940/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
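
The two "Discarding unparseable args" warnings are emitted by PipelineOptions for any flag that no registered options class declares; the value is dropped from the parsed options (here the warning is typically benign, since the load-test harness consumes the flag through its own parsing). A minimal sketch, assuming one wanted the flag to be parsed by PipelineOptions itself (the options class name below is illustrative, not the harness's own):

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):
      @classmethod
      def _add_argparse_args(cls, parser):
        # Registering the flag keeps PipelineOptions from discarding it.
        parser.add_argument('--pubsub_namespace_prefix', default=None)

    opts = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    print(opts.view_as(PubsubPerfOptions).pubsub_namespace_prefix)
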
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220324155116486158-3179'
 createTime: '2022-03-24T15:51:23.539504Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-24_08_51_23-2019057477283921742'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0324150534'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-24T15:51:23.539504Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-24_08_51_23-2019057477283921742]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-24_08_51_23-2019057477283921742
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-24_08_51_23-2019057477283921742?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-24_08_51_23-2019057477283921742 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:42.259Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:43.584Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:43.617Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:43.689Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:43.732Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:43.761Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:43.786Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:43.842Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:43.892Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:43.924Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:43.948Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:43.983Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.007Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.036Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.062Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.096Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.237Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.279Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.314Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.347Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.383Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.462Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.496Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.561Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:52:14.415Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
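
The metric-descriptor message above is only a warning about the per-project limit on custom metric descriptors: the job still runs, but new custom.googleapis.com/* metrics are not created. If stale descriptors ever need to be cleared out, a minimal sketch against the Cloud Monitoring client (which wraps the REST API the warning links to) could look like the following; the project name is a placeholder and deletion is irreversible, so this is only an illustration:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project = 'projects/my-gcp-project'  # placeholder, not apache-beam-testing

    request = {
        'name': project,
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        # Each deletion frees one of the custom metric descriptor slots
        # mentioned in the warning above.
        client.delete_metric_descriptor(name=descriptor.name)
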
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:52:18.007Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:52:18.034Z: JOB_MESSAGE_DETAILED: Resized **** pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:52:28.230Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:52:48.461Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:01:02.160Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:01:02.237Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:01:02.278Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:01:02.323Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:01:02.348Z: JOB_MESSAGE_BASIC: Stopping **** pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-24_08_51_23-2019057477283921742 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 7b6876e3ede44d4d927ec766a2df853b and timestamp: 1648137820.6557474:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 107
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137825.572785/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137825.572785/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137825.572785/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137825.572785/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137825.572785/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137825.572785/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137825.572785/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137825.572785/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220324160345573769-1024'
 createTime: '2022-03-24T16:03:51.844999Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-24_09_03_51-3352884538278352834'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0324150534'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-24T16:03:51.844999Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-24_09_03_51-3352884538278352834]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-24_09_03_51-3352884538278352834
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-24_09_03_51-3352884538278352834?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-24_09_03_51-3352884538278352834 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:58.338Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.469Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.503Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.562Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.635Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.675Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.732Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.811Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.843Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.867Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.896Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.935Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.972Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.022Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.082Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.129Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.163Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.216Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.249Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.298Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.329Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.362Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.402Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.437Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.463Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.522Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.556Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.608Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.641Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.696Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:12.425Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:40.683Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:05:06.417Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-24_09_03_51-3352884538278352834 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 759ba600232f4621a7c8b6780d246289 and timestamp: 1648138627.1467967:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 106
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 759ba600232f4621a7c8b6780d246289 and timestamp: 1648138627.1467967:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 106
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_93974781-8d68-4831-ab20-0388ab0d181c_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-24_08_51_23-2019057477283921742?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-24_09_03_51-3352884538278352834?project=apache-beam-testing
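
The TypeError above comes from the google-cloud-pubsub 2.x client: SubscriberClient.delete_subscription no longer accepts the subscription path as a positional argument, so the string is treated as a DeleteSubscriptionRequest mapping and rejected. A minimal sketch of the keyword form the cleanup call would need (the client and subscription path below are illustrative placeholders, not the test's own objects):

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    # Hypothetical subscription path, shown only to illustrate the call shape.
    sub_name = 'projects/my-project/subscriptions/my-read-subscription'

    # In google-cloud-pubsub >= 2.0 the path must be passed as a keyword
    # argument (or wrapped in a DeleteSubscriptionRequest); passing it
    # positionally raises the TypeError seen in the traceback above.
    sub_client.delete_subscription(subscription=sub_name)
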

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 44s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/tbxjjxwwrioy2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #652

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/652/display/redirect?page=changes>

Changes:

[mmack] [BEAM-13232] Close clients properly in KinesisSource. Also use lazy init

[noreply] [BEAM-14141] Set Interactive Beam to use the default Dataproc image

[noreply] BEAM-14115 - Update find criteria limited to _id (#17102)

[chamikaramj] Disable BigQueryIOStorageWriteIT for general Java post-commit

[noreply] Revert "[BEAM-14038] Auto-startup for Python expansion service.

[noreply] Minor: Bump timeout for Java PreCommit (#17157)

[noreply] [BEAM-14152] Disable flaky

[noreply] Fixing a small bug in TypedSchemaTransformTest that caused it to flake.

[noreply] [BEAM-14116] Catch MonitoringInfoMetricName null keys or values in the

[noreply] [BEAM-14129] Restructure SubscriptionPartitionLoader to use a manual SDF

[noreply] [BEAM-13015] Avoid repeated weighing of StateKey in

[noreply] Add option to add modules to JDK add-open (#17110)

[noreply] [BEAM-13015] Clarify ownership of the list for state caching across

[noreply] [BEAM-14134] Optimize memory allocations for various core coders

[noreply] [BEAM-14129] Restructure PubsubLiteIO Read side to produce smaller


------------------------------------------
[...truncated 57.07 KB...]
> Task :sdks:java:harness:classes
> Task :sdks:java:harness:jar

> Task :runners:java-fn-execution:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:java-fn-execution:classes
> Task :runners:java-fn-execution:jar

> Task :sdks:java:expansion-service:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:expansion-service:classes
> Task :sdks:java:expansion-service:jar

> Task :sdks:java:io:kafka:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:kafka:classes
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:google-cloud-platform:classes
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:google-cloud-dataflow-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:****:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:****:classes
> Task :runners:google-cloud-dataflow-java:****:shadowJar

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648051329.503545/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648051329.503545/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648051329.503545/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648051329.503545/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648051329.503545/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648051329.503545/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648051329.503545/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648051329.503545/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220323160209504433-4100'
 createTime: '2022-03-23T16:02:18.177547Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-23_09_02_15-1936031335567360432'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0323150537'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-23T16:02:18.177547Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-23_09_02_15-1936031335567360432]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-23_09_02_15-1936031335567360432
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-23_09_02_15-1936031335567360432?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-23_09_02_15-1936031335567360432 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:24.405Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.420Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.447Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.512Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.574Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.600Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.633Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.665Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.703Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.730Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.761Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.795Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.828Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.864Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.896Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.927Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:26.036Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:26.065Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:26.094Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:26.115Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:26.147Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:26.196Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:26.232Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:26.251Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:40.646Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:03:09.882Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:03:34.293Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:11:41.568Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:11:41.634Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:11:41.667Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:11:41.704Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:11:41.732Z: JOB_MESSAGE_BASIC: Stopping **** pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-23_09_02_15-1936031335567360432 after 601 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 568be7cce4b34445a3b0f42d17f479b4 and timestamp: 1648052050.1394997:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 110
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648052055.140426/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648052055.140426/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648052055.140426/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648052055.140426/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648052055.140426/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648052055.140426/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648052055.140426/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648052055.140426/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220323161415141346-2196'
 createTime: '2022-03-23T16:14:21.354513Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-23_09_14_20-7322242075389462471'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0323150537'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-23T16:14:21.354513Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-23_09_14_20-7322242075389462471]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-23_09_14_20-7322242075389462471
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-23_09_14_20-7322242075389462471?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-23_09_14_20-7322242075389462471 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:27.028Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:33.797Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:33.912Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:33.995Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.088Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.174Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.269Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.370Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.424Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.482Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.527Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.568Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.622Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.666Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.691Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.742Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.779Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.818Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.879Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.935Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.989Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:35.044Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:35.100Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:35.196Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:35.235Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:35.288Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:35.337Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:35.411Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:35.447Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:35.500Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:54.448Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:15:06.618Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:15:06.652Z: JOB_MESSAGE_DETAILED: Resized **** pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:15:16.970Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:15:40.507Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-23_09_14_20-7322242075389462471 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: c4256ba0071d4ff79bcee51ff1c9a8c1 and timestamp: 1648052856.3371046:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 108
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: c4256ba0071d4ff79bcee51ff1c9a8c1 and timestamp: 1648052856.3371046:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 108
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_4b8cd688-8840-49e4-8c80-d4e4c3b20978_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-23_09_02_15-1936031335567360432?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-23_09_14_20-7322242075389462471?project=apache-beam-testing

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 38m 14s
92 actionable tasks: 73 executed, 17 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ikszch3dtyrne

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #651

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/651/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-14124] Add display data to BQ storage reads.

[noreply] fixes static checks and go lint issues (#17138)

[Kyle Weaver] Don't print in task configuration.

[noreply] [BEAM-14136] Clean up staticcheck and linter warnings in the Go SDK

[noreply] Merge pull request #17063 from [BEAM-12164] Fix flaky tests

[noreply] Revert "[BEAM-14112] Avoid storing a generator in _CustomBigQuerySource

[Kyle Weaver] [BEAM-4106] Remove filesToStage from Flink pipeline option list.

[noreply] [BEAM-14071] Enabling Flink on Dataproc for Interactive Beam (#17044)

[noreply] Minor: Bypass schema registry in schemas_test.py (#17108)


------------------------------------------
[...truncated 55.62 KB...]
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2695811 sha256=6f7de31194ea5092dc7c9abd6b12437253560c9e4b8083318981db90ba867cf4
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.23 botocore-1.24.23 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964243.817059/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964243.817059/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964243.817059/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964243.817059/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964243.817059/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964243.817059/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964243.817059/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964243.817059/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220322155043818043-8112'
 createTime: '2022-03-22T15:50:50.676888Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-22_08_50_49-11404046316872429164'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0322150536'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-22T15:50:50.676888Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-22_08_50_49-11404046316872429164]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-22_08_50_49-11404046316872429164
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-22_08_50_49-11404046316872429164?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-22_08_50_49-11404046316872429164 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:57.691Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.400Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.425Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.474Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.501Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.529Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.555Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.580Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.633Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.658Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.679Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.705Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.731Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.755Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.780Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.808Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.916Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.940Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.962Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:59.006Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:59.035Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:59.081Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:59.108Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:59.134Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:51:23.648Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:51:34.671Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:51:34.703Z: JOB_MESSAGE_DETAILED: Resized **** pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:51:45.118Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:52:07.942Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:00:13.389Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:00:13.435Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:00:13.461Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:00:13.496Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:00:13.526Z: JOB_MESSAGE_BASIC: Stopping **** pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-22_08_50_49-11404046316872429164 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 96afde8eaafb4e6a893245141458d337 and timestamp: 1647964963.6769545:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 101
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964968.693656/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964968.693656/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964968.693656/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964968.693656/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964968.693656/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964968.693656/dataflow-****.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964968.693656/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964968.693656/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220322160248694593-6315'
 createTime: '2022-03-22T16:02:56.224659Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-22_09_02_55-11823124561686518281'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0322150536'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-22T16:02:56.224659Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-22_09_02_55-11823124561686518281]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-22_09_02_55-11823124561686518281
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-22_09_02_55-11823124561686518281?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-22_09_02_55-11823124561686518281 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:01.569Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.276Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.308Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.375Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.445Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.478Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.595Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.656Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.689Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.714Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.746Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.780Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.802Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.863Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.896Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.920Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.998Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.128Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.267Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.358Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.458Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.496Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.579Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.613Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.642Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.665Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.698Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.752Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.776Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.821Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:17.847Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:46.049Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:04:11.406Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-22_09_02_55-11823124561686518281 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 4b79ebd830c74cd7a2c8a4261156770c and timestamp: 1647965761.5007348:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 121
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 4b79ebd830c74cd7a2c8a4261156770c and timestamp: 1647965761.5007348:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 121
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_10dd009b-e529-4e81-b1da-c823909177c8_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-22_08_50_49-11404046316872429164?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-22_09_02_55-11823124561686518281?project=apache-beam-testing

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 26m 39s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/3o7i7bqkjygye

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #650

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/650/display/redirect?page=changes>

Changes:

[mmack] [adhoc] Move aws IT tests to testing package according to best practices


------------------------------------------
[...truncated 55.53 KB...]
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2694808 sha256=8b19de0dc4c5f35b39d3bb42949752759993f19af92fb6831c350f5bfa7f3caf
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.22 botocore-1.24.22 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647877834.333198/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647877834.333198/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647877834.333198/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647877834.333198/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647877834.333198/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647877834.333198/dataflow-****.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647877834.333198/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647877834.333198/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220321155034334118-9216'
 createTime: '2022-03-21T15:50:40.293803Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-21_08_50_39-7986734118607749467'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0321150516'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-21T15:50:40.293803Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-21_08_50_39-7986734118607749467]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-21_08_50_39-7986734118607749467
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-21_08_50_39-7986734118607749467?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-21_08_50_39-7986734118607749467 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:55.170Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:55.851Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:55.883Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:55.962Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.036Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.064Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.088Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.114Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.150Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.180Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.204Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.226Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.282Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.310Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.338Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.370Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.466Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.516Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.560Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.598Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.631Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.691Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.718Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:57.008Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:51:16.953Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:51:47.604Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:52:12.523Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:00:16.233Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:00:16.273Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:00:16.301Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:00:16.334Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:00:16.357Z: JOB_MESSAGE_BASIC: Stopping **** pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-21_08_50_39-7986734118607749467 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: a70f98c3857944a3855499dfe4e33d4d and timestamp: 1647878567.1928797:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 107
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647878573.202556/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647878573.202556/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647878573.202556/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647878573.202556/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647878573.202556/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647878573.202556/dataflow-****.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647878573.202556/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647878573.202556/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220321160253203446-7456'
 createTime: '2022-03-21T16:03:01.180210Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-21_09_03_00-11981826017667924636'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0321150516'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-21T16:03:01.180210Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-21_09_03_00-11981826017667924636]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-21_09_03_00-11981826017667924636
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-21_09_03_00-11981826017667924636?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-21_09_03_00-11981826017667924636 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:20.128Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.181Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.214Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.295Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.396Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.430Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.518Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.589Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.633Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.671Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.703Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.737Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.769Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.797Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.830Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.870Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.903Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.942Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.970Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:22.005Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:22.038Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:22.082Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:22.131Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:22.178Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:22.241Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:22.300Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:22.338Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:22.413Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:22.456Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:22.494Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:50.473Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
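[Editor's note] The metric-descriptor message above points at the Monitoring API it links to for pruning stale custom.googleapis.com/* descriptors. A hedged sketch of that cleanup with the google-cloud-monitoring client follows; the project name and filter are illustrative, only user-created custom descriptors can be deleted, and the delete call is left commented out as a dry run:

    from google.cloud import monitoring_v3

    # Illustrative project; the tests in this log run under apache-beam-testing.
    project_name = "projects/apache-beam-testing"
    client = monitoring_v3.MetricServiceClient()

    # List only custom metric descriptors (the kind Dataflow creates for user counters).
    descriptors = client.list_metric_descriptors(
        request={
            "name": project_name,
            "filter": 'metric.type = starts_with("custom.googleapis.com/")',
        }
    )
    for descriptor in descriptors:
        print("candidate for deletion:", descriptor.type)
        # Uncomment after reviewing which descriptors are actually unused:
        # client.delete_metric_descriptor(name=descriptor.name)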
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:04:02.609Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:04:27.144Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-21_09_03_00-11981826017667924636 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 2e64bd4172a94a6fa41e0be407444f18 and timestamp: 1647879382.1889186:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 83
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 2e64bd4172a94a6fa41e0be407444f18 and timestamp: 1647879382.1889186:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 83
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-21_08_50_39-7986734118607749467?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-21_09_03_00-11981826017667924636?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_2a3364bc-3b3d-41ed-a67f-71efb77b53c1_read'
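[Editor's note] This TypeError (the same cleanup failure recurs in the #648 and #649 logs below) comes from passing the subscription path positionally: in google-cloud-pubsub 2.x, SubscriberClient.delete_subscription treats a bare positional argument as the request mapping, so a plain string fails. The 2.x client expects either a DeleteSubscriptionRequest (or an equivalent dict) or the flattened subscription= keyword. A minimal sketch of the keyword-style call, using a hypothetical subscription path:

    from google.cloud import pubsub_v1

    # Hypothetical subscription path; the perf test builds its own name from the
    # pubsub_io_performance_ namespace prefix visible in the logs above.
    sub_path = "projects/apache-beam-testing/subscriptions/pubsub_io_performance_example_read"

    client = pubsub_v1.SubscriberClient()

    # Pass the path as the flattened keyword argument (or wrap it in a
    # DeleteSubscriptionRequest); a positional string raises the TypeError above.
    client.delete_subscription(subscription=sub_path)

The same keyword-only convention applies to the other flattened parameters of the 2.x publisher and subscriber clients.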

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 2s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/7ooeytqd3flf4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #649

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/649/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14122] Upgrade pip-licenses dependency (#17132)


------------------------------------------
[...truncated 55.43 KB...]
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2694808 sha256=b98999392a7be93242af5901805a2c138d4377844bdbf8c8dc143e9a41e78993
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.22 botocore-1.24.22 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647791442.032460/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647791442.032460/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647791442.032460/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647791442.032460/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647791442.032460/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647791442.032460/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647791442.032460/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647791442.032460/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220320155042033408-8178'
 createTime: '2022-03-20T15:50:48.912227Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-20_08_50_48-15406086144577952723'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0320150521'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-20T15:50:48.912227Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-20_08_50_48-15406086144577952723]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-20_08_50_48-15406086144577952723
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-20_08_50_48-15406086144577952723?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-20_08_50_48-15406086144577952723 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:54.603Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.494Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.525Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.578Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.618Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.652Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.683Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.759Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.800Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.830Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.882Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.915Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.939Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.963Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.988Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:56.006Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:56.099Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:56.120Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:56.152Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:56.182Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:56.217Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:56.267Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:56.291Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:56.319Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:51:30.010Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:51:30.970Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:51:30.993Z: JOB_MESSAGE_DETAILED: Resized **** pool to 3, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:51:41.333Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:52:03.591Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:52:03.627Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:00:08.788Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:00:08.863Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:00:08.891Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:00:08.930Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:00:08.992Z: JOB_MESSAGE_BASIC: Stopping **** pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-20_08_50_48-15406086144577952723 after 601 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: f4c8d84dad2349e0a4c77f6dd09e9550 and timestamp: 1647792171.0019605:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 111
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647792175.511248/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647792175.511248/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647792175.511248/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647792175.511248/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647792175.511248/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647792175.511248/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647792175.511248/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647792175.511248/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220320160255512247-8623'
 createTime: '2022-03-20T16:03:01.914625Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-20_09_03_01-14277975848265007722'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0320150521'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-20T16:03:01.914625Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-20_09_03_01-14277975848265007722]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-20_09_03_01-14277975848265007722
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-20_09_03_01-14277975848265007722?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-20_09_03_01-14277975848265007722 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:06.727Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.431Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.464Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.529Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.603Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.628Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.695Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.759Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.797Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.825Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.846Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.878Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.899Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.931Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.953Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.986Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.009Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.038Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.071Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.105Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.137Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.170Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.208Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.258Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.283Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.329Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.379Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.525Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.585Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.745Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:17.588Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:54.905Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:04:19.177Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:04:19.213Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-20_09_03_01-14277975848265007722 after 601 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 411244e296e848b2b8aa6378c1c82ee7 and timestamp: 1647792961.58281:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 83
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 411244e296e848b2b8aa6378c1c82ee7 and timestamp: 1647792961.58281:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 83
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-20_08_50_48-15406086144577952723?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-20_09_03_01-14277975848265007722?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_2e599797-d216-4c1a-aa8e-6e8298db4e80_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 26m 39s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/nvf2xkvdthwjq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #648

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/648/display/redirect?page=changes>

Changes:

[Kiley Sok] Add Java 17 Nexmark metrics to Grafana

[yiru] .

[yiru] .

[yiru] .

[yiru] format fix

[yiru] .

[yiru] make DoFn into a separate class

[yiru] .

[yiru] fix setting

[mmack] [BEAM-14125] Update website IO matrix to recommend aws2 IOs

[noreply] [BEAM-14128] Eliminating quadratic behavior of

[noreply] [BEAM-13972] Add RunInference interface (#16917)

[noreply] Merge pull request #17116 from [BEAM-12164] Remove change_stream in

[yiru] fix checkstyle

[yiru] spotlessapply

[noreply] Deprecate tags.go (#17025)

[noreply] [BEAM-12753] and [BEAM-12815] Fix Flink Integration Tests (#17067)

[noreply] Merge pull request #16895 from [BEAM-13882][Playground] More tests for

[noreply] [BEAM-13925] Add weekly automation to update our reviewer config

[noreply] Merge pull request #17076 from Beam 14082 update payground for mobile

[noreply] [BEAM-13925] Assign committers in the scheduled action (#17062)

[noreply] Pin setup-gcloud to v0 instead of master (#17123)

[noreply] [BEAM-3304] documentation for PaneInfo in BPG (#17047)

[noreply] Merge pull request #17016 from [BEAM-14049][Playground] Add new API

[noreply] Merge pull request #17077 from [BEAM-14078] [Website] change link

[noreply] Merge pull request #17085 from [BEAM-14077] [Website] add beam

[noreply] Update Changes.md w/Go pipeline pre-process fix.

[noreply] [BEAM-14098] wrapper for postgres on JDBC IO GO SDK (#17088)

[noreply] Merge pull request #17023 from [BEAM-12164]: Remove child partition


------------------------------------------
[...truncated 55.34 KB...]
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2694808 sha256=4e7bf0b4df5bdbf9be3c5207ecb53bbad5594cacb11adc8aa4941eeac4c9052d
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.22 botocore-1.24.22 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705035.298282/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705035.298282/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705035.298282/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705035.298282/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705035.298282/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705035.298282/dataflow-****.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705035.298282/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705035.298282/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220319155035299183-9062'
 createTime: '2022-03-19T15:50:41.232485Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-19_08_50_40-4949934856950811884'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0319150516'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-19T15:50:41.232485Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-19_08_50_40-4949934856950811884]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-19_08_50_40-4949934856950811884
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-19_08_50_40-4949934856950811884?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-19_08_50_40-4949934856950811884 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:46.095Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.026Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.061Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.109Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.149Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.169Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.195Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.229Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.268Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.323Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.357Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.381Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.409Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.436Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.460Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.488Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.562Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.597Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.628Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.650Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.679Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.740Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.780Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.813Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:51:18.880Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:51:32.187Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:51:54.843Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:51:54.892Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:00:12.139Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:00:12.212Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:00:12.303Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:00:12.377Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:00:12.427Z: JOB_MESSAGE_BASIC: Stopping **** pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-19_08_50_40-4949934856950811884 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 5ae25cae0d6c4120966f7b1b34daaddd and timestamp: 1647705768.089736:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 119
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705772.893856/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705772.893856/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705772.893856/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705772.893856/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705772.893856/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705772.893856/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705772.893856/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705772.893856/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220319160252894729-1246'
 createTime: '2022-03-19T16:02:59.405582Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-19_09_02_58-17112221556829961310'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0319150516'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-19T16:02:59.405582Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-19_09_02_58-17112221556829961310]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-19_09_02_58-17112221556829961310
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-19_09_02_58-17112221556829961310?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-19_09_02_58-17112221556829961310 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:05.654Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:06.683Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:06.704Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:06.761Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:06.831Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:06.860Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:06.924Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.025Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.069Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.109Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.138Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.169Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.201Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.234Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.265Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.288Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.375Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.407Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.438Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.473Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.505Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.539Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.577Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.605Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.636Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.681Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.712Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.771Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.801Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.877Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:21.769Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
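
The service message above suggests deleting old / unused custom metric descriptors. A minimal sketch of doing that with google-cloud-monitoring, assuming the apache-beam-testing project and a filter on the custom.googleapis.com/* prefix named in the message; this is illustrative only and is not part of the test run:

from google.cloud import monitoring_v3

# Sketch: list custom metric descriptors and delete the unused ones.
# Note that deleting a descriptor also deletes its historical time series.
client = monitoring_v3.MetricServiceClient()
request = monitoring_v3.ListMetricDescriptorsRequest(
    name="projects/apache-beam-testing",
    filter='metric.type = starts_with("custom.googleapis.com/")',
)
for descriptor in client.list_metric_descriptors(request=request):
    client.delete_metric_descriptor(name=descriptor.name)
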
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:50.719Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:04:15.346Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:04:15.368Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-19_09_02_58-17112221556829961310 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 93d4667d58454293aba1219b13b4984e and timestamp: 1647706584.5855942:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 89
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-19_08_50_40-4949934856950811884?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-19_09_02_58-17112221556829961310?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_51b9254f-d449-4d9d-a72e-0aa27aee3b8c_read'
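
With the google-cloud-pubsub 2.x client (the #647 log further below shows 2.11.0 installed), SubscriberClient.delete_subscription no longer takes a bare subscription path positionally: the first positional parameter is a DeleteSubscriptionRequest mapping, which is why the proto-plus constructor rejects the string above. A minimal sketch of the keyword form the 2.x client does accept; the variable names mirror the test code and the subscription path is only a placeholder:

from google.cloud import pubsub_v1

# Sketch only: pass the subscription path via the `subscription=` keyword
# (or a request mapping) instead of positionally.
sub_client = pubsub_v1.SubscriberClient()
read_sub_name = (  # placeholder; the real path is generated per test run
    "projects/apache-beam-testing/subscriptions/"
    "pubsub_io_performance_<uuid>_read")

sub_client.delete_subscription(subscription=read_sub_name)
# Equivalent: sub_client.delete_subscription(
#     request={"subscription": read_sub_name})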

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 2s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zrugjq26r4u5y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #647

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/647/display/redirect?page=changes>

Changes:

[Luke Cwik] [BEAM-10212] Clean-up comments, remove rawtypes usage.

[noreply] [BEAM-14112] Avoid storing a generator in _CustomBigQuerySource (#17100)

[noreply] Populate environment capabilities in v1beta3 protos. (#17042)

[Kyle Weaver] [BEAM-12976] Test a whole pipeline using projection pushdown in BQ IO.

[Kyle Weaver] [BEAM-12976] Enable projection pushdown for Java pipelines on Dataflow,

[noreply] [BEAM-14038] Auto-startup for Python expansion service. (#17035)

[Kyle Weaver] [BEAM-14123] Fix typo in hdfsIntegrationTest task name.

[noreply] [BEAM-13893] improved coverage of jobopts package (#17003)

[noreply] Merge pull request #16977 from [BEAM-12164]  Added integration test for

[mmack] [adhoc] Minor cleanup for aws2 tests


------------------------------------------
[...truncated 55.58 KB...]
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2694808 sha256=ab5829fc1d38fefd1f6d6bc3cfe6f0a1a9c88d88f8d5755a916e444bf696e70f
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.21 botocore-1.24.21 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647618645.399622/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647618645.399622/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647618645.399622/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647618645.399622/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647618645.399622/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647618645.399622/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647618645.399622/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647618645.399622/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220318155045400535-5117'
 createTime: '2022-03-18T15:50:54.510892Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-18_08_50_51-1213532163946750817'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0318150529'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-18T15:50:54.510892Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-18_08_50_51-1213532163946750817]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-18_08_50_51-1213532163946750817
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-18_08_50_51-1213532163946750817?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-18_08_50_51-1213532163946750817 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:04.459Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:05.777Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:05.815Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:05.880Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:05.944Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.004Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.044Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.078Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.130Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.164Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.199Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.232Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.284Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.320Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.357Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.393Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.520Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.560Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.591Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.627Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.661Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.760Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.817Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.873Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:34.595Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:46.600Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:52:11.215Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:52:11.247Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:00:21.816Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:00:21.955Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:00:21.999Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:00:22.032Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:00:22.097Z: JOB_MESSAGE_BASIC: Stopping **** pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-18_08_50_51-1213532163946750817 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 939eb58af10a487fb999cf943cb9b1fa and timestamp: 1647619381.7494645:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 102
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647619387.446200/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647619387.446200/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647619387.446200/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647619387.446200/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647619387.446200/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647619387.446200/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647619387.446200/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647619387.446200/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220318160307447095-3228'
 createTime: '2022-03-18T16:03:14.510021Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-18_09_03_13-1508967428110627461'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0318150529'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-18T16:03:14.510021Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-18_09_03_13-1508967428110627461]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-18_09_03_13-1508967428110627461
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-18_09_03_13-1508967428110627461?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-18_09_03_13-1508967428110627461 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:20.164Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.219Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.245Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.329Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.418Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.445Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.532Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.602Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.671Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.712Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.747Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.785Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.820Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.857Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.895Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.929Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.973Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.020Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.062Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.100Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.137Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.237Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.272Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.297Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.339Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.389Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.432Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.479Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.532Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.571Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:41.208Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:54.686Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:54.729Z: JOB_MESSAGE_DETAILED: Resized **** pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:04:05.008Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:04:27.231Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:04:27.272Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-18_09_03_13-1508967428110627461 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 6615e05ff94e4d018d63bd609f4a9215 and timestamp: 1647620237.4886332:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 302
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-18_08_50_51-1213532163946750817?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-18_09_03_13-1508967428110627461?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_648d3d6e-488c-4e06-b60a-d7e132c815cd_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 53s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/4yvebz6ba443o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org