Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/07/10 13:01:40 UTC

Build failed in Jenkins: beam_LoadTests_Python_ParDo_Dataflow_Streaming #660

See <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/660/display/redirect?page=changes>

Changes:

[noreply] Add typescript documentation to the programing guide. (#22137)

[noreply] [Website] Update minimum required Go version for sdk development


------------------------------------------
[...truncated 25.52 KB...]
  Using cached google_api_core-1.31.6-py2.py3-none-any.whl (93 kB)
Requirement already satisfied: setuptools>=40.3.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-auth<3,>=1.18.0->apache-beam==2.41.0.dev0) (63.1.0)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.47.0-py3-none-any.whl (10.0 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.1.0-py3-none-any.whl (14 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Requirement already satisfied: pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from httplib2<0.21.0,>=0.8->apache-beam==2.41.0.dev0) (3.0.9)
Collecting pbr>=0.11
  Using cached pbr-5.9.0-py2.py3-none-any.whl (112 kB)
Collecting attrs>=17.4.0
  Using cached attrs-21.4.0-py2.py3-none-any.whl (60 kB)
Requirement already satisfied: py>=1.5.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (1.11.0)
Collecting more-itertools>=4.0.0
  Using cached more_itertools-8.13.0-py3-none-any.whl (51 kB)
Collecting pluggy<1.0,>=0.12
  Using cached pluggy-0.13.1-py2.py3-none-any.whl (18 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (2.1.3)
Collecting wcwidth
  Using cached wcwidth-0.2.5-py2.py3-none-any.whl (30 kB)
Collecting atomicwrites>=1.0
  Using cached atomicwrites-1.4.1-py2.py3-none-any.whl
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.6.15-py3-none-any.whl (160 kB)
Collecting charset-normalizer<3,>=2
  Using cached charset_normalizer-2.1.0-py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.10-py2.py3-none-any.whl (139 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker>=4.0.0
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.3-py3-none-any.whl (54 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.3-py2.py3-none-any.whl (211 kB)
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.8.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.8.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.3-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.2-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (3.8.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.41.0.dev0-py3-none-any.whl size=2821134 sha256=cd48f5d5bd092116fd94896a8b2558297b198739048327c8ab44caf25a562769
  Stored in directory: /home/jenkins/.cache/pip/wheels/4d/28/a8/75e525f0f56ebf0cac86293ab90763d6a2a3105a27bb3ba779
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, threadpoolctl, tenacity, rsa, pyyaml, python-dateutil, pymysql, pymongo, pyhamcrest, pydot, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, joblib, jmespath, isodate, idna, httplib2, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, certifi, cachetools, attrs, atomicwrites, sqlalchemy, scipy, requests, pyarrow, pluggy, pandas, overrides, oauth2client, mock, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, deprecation, cffi, botocore, scikit-learn, s3transfer, requests-oauthlib, requests_mock, pytest, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.41.0.dev0 atomicwrites-1.4.1 attrs-21.4.0 azure-core-1.24.2 azure-storage-blob-12.13.0 boto3-1.24.26 botocore-1.27.26 cachetools-4.2.4 certifi-2022.6.15 cffi-1.15.1 charset-normalizer-2.1.0 cloudpickle-2.1.0 crcmod-1.7 cryptography-37.0.4 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.5.2 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.6 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.4 google-cloud-bigquery-storage-2.13.2 google-cloud-bigtable-1.7.2 google-cloud-core-1.7.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.7.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.13.1 google-cloud-pubsublite-1.4.2 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.3 google-cloud-videointelligence-1.16.3 google-cloud-vision-1.0.2 google-crc32c-1.3.0 google-resumable-media-2.3.3 googleapis-common-protos-1.56.3 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.47.0 hdfs-2.7.0 httplib2-0.20.4 idna-3.3 isodate-0.6.1 jmespath-1.0.1 joblib-1.1.0 mock-2.0.0 more-itertools-8.13.0 msrest-0.7.1 numpy-1.21.6 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.7.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.9.0 pluggy-0.13.1 proto-plus-1.20.6 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.28.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sqlalchemy-1.4.39 tenacity-5.1.5 testcontainers-3.6.0 threadpoolctl-3.1.0 typing-extensions-4.3.0 typing-utils-0.1.0 urllib3-1.26.10 wcwidth-0.2.5 websocket-client-1.3.3 wrapt-1.14.1

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.41.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0706185453.1657458097.405483/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0706185453.1657458097.405483/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0706185453.1657458097.405483/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0706185453.1657458097.405483/pipeline.pb in 0 seconds.
usage: pardo_test.py [-h] [--runner RUNNER] [--streaming]
                     [--resource_hint RESOURCE_HINTS]
                     [--beam_services BEAM_SERVICES]
                     [--type_check_strictness {ALL_REQUIRED,DEFAULT_TO_ANY}]
                     [--type_check_additional TYPE_CHECK_ADDITIONAL]
                     [--no_pipeline_type_check] [--runtime_type_check]
                     [--performance_runtime_type_check]
                     [--allow_non_deterministic_key_coders]
                     [--allow_unsafe_triggers]
                     [--no_direct_runner_use_stacked_bundle]
                     [--direct_runner_bundle_repeat DIRECT_RUNNER_BUNDLE_REPEAT]
                     [--direct_num_workers DIRECT_NUM_WORKERS]
                     [--direct_running_mode {in_memory,multi_threading,multi_processing}]
                     [--direct_embed_docker_python]
                     [--dataflow_endpoint DATAFLOW_ENDPOINT]
                     [--project PROJECT] [--job_name JOB_NAME]
                     [--staging_location STAGING_LOCATION]
                     [--temp_location TEMP_LOCATION] [--region REGION]
                     [--service_account_email SERVICE_ACCOUNT_EMAIL]
                     [--no_auth] [--template_location TEMPLATE_LOCATION]
                     [--label LABELS] [--update]
                     [--transform_name_mapping TRANSFORM_NAME_MAPPING]
                     [--enable_streaming_engine]
                     [--dataflow_kms_key DATAFLOW_KMS_KEY]
                     [--create_from_snapshot CREATE_FROM_SNAPSHOT]
                     [--flexrs_goal {COST_OPTIMIZED,SPEED_OPTIMIZED}]
                     [--dataflow_service_option DATAFLOW_SERVICE_OPTIONS]
                     [--enable_hot_key_logging] [--enable_artifact_caching]
                     [--impersonate_service_account IMPERSONATE_SERVICE_ACCOUNT]
                     [--hdfs_host HDFS_HOST] [--hdfs_port HDFS_PORT]
                     [--hdfs_user HDFS_USER] [--hdfs_full_urls]
                     [--num_workers NUM_WORKERS]
                     [--max_num_workers MAX_NUM_WORKERS]
                     [--autoscaling_algorithm {NONE,THROUGHPUT_BASED}]
                     [--worker_machine_type MACHINE_TYPE]
                     [--disk_size_gb DISK_SIZE_GB]
                     [--worker_disk_type DISK_TYPE]
                     [--worker_region WORKER_REGION]
                     [--worker_zone WORKER_ZONE] [--zone ZONE]
                     [--network NETWORK] [--subnetwork SUBNETWORK]
                     [--worker_harness_container_image WORKER_HARNESS_CONTAINER_IMAGE]
                     [--sdk_container_image SDK_CONTAINER_IMAGE]
                     [--sdk_harness_container_image_overrides SDK_HARNESS_CONTAINER_IMAGE_OVERRIDES]
                     [--default_sdk_harness_log_level DEFAULT_SDK_HARNESS_LOG_LEVEL]
                     [--sdk_harness_log_level_overrides SDK_HARNESS_LOG_LEVEL_OVERRIDES]
                     [--use_public_ips] [--no_use_public_ips]
                     [--min_cpu_platform MIN_CPU_PLATFORM]
                     [--dataflow_worker_jar DATAFLOW_WORKER_JAR]
                     [--dataflow_job_file DATAFLOW_JOB_FILE]
                     [--experiment EXPERIMENTS]
                     [--number_of_worker_harness_threads NUMBER_OF_WORKER_HARNESS_THREADS]
                     [--profile_cpu] [--profile_memory]
                     [--profile_location PROFILE_LOCATION]
                     [--profile_sample_rate PROFILE_SAMPLE_RATE]
                     [--requirements_file REQUIREMENTS_FILE]
                     [--requirements_cache REQUIREMENTS_CACHE]
                     [--requirements_cache_only_sources]
                     [--setup_file SETUP_FILE] [--beam_plugin BEAM_PLUGINS]
                     [--pickle_library {cloudpickle,default,dill}]
                     [--save_main_session] [--sdk_location SDK_LOCATION]
                     [--extra_package EXTRA_PACKAGES]
                     [--prebuild_sdk_container_engine PREBUILD_SDK_CONTAINER_ENGINE]
                     [--prebuild_sdk_container_base_image PREBUILD_SDK_CONTAINER_BASE_IMAGE]
                     [--cloud_build_machine_type CLOUD_BUILD_MACHINE_TYPE]
                     [--docker_registry_push_url DOCKER_REGISTRY_PUSH_URL]
                     [--job_endpoint JOB_ENDPOINT]
                     [--artifact_endpoint ARTIFACT_ENDPOINT]
                     [--job_server_timeout JOB_SERVER_TIMEOUT]
                     [--environment_type ENVIRONMENT_TYPE]
                     [--environment_config ENVIRONMENT_CONFIG]
                     [--environment_option ENVIRONMENT_OPTIONS]
                     [--sdk_worker_parallelism SDK_WORKER_PARALLELISM]
                     [--environment_cache_millis ENVIRONMENT_CACHE_MILLIS]
                     [--output_executable_path OUTPUT_EXECUTABLE_PATH]
                     [--artifacts_dir ARTIFACTS_DIR] [--job_port JOB_PORT]
                     [--artifact_port ARTIFACT_PORT]
                     [--expansion_port EXPANSION_PORT]
                     [--job_server_java_launcher JOB_SERVER_JAVA_LAUNCHER]
                     [--job_server_jvm_properties JOB_SERVER_JVM_PROPERTIES]
                     [--flink_master FLINK_MASTER]
                     [--flink_version {1.12,1.13,1.14,1.15}]
                     [--flink_job_server_jar FLINK_JOB_SERVER_JAR]
                     [--flink_submit_uber_jar]
                     [--spark_master_url SPARK_MASTER_URL]
                     [--spark_job_server_jar SPARK_JOB_SERVER_JAR]
                     [--spark_submit_uber_jar]
                     [--spark_rest_url SPARK_REST_URL] [--spark_version {2,3}]
                     [--on_success_matcher ON_SUCCESS_MATCHER]
                     [--dry_run DRY_RUN]
                     [--wait_until_finish_duration WAIT_UNTIL_FINISH_DURATION]
                     [--pubsub_root_url PUBSUBROOTURL]
                     [--s3_access_key_id S3_ACCESS_KEY_ID]
                     [--s3_secret_access_key S3_SECRET_ACCESS_KEY]
                     [--s3_session_token S3_SESSION_TOKEN]
                     [--s3_endpoint_url S3_ENDPOINT_URL]
                     [--s3_region_name S3_REGION_NAME]
                     [--s3_api_version S3_API_VERSION] [--s3_verify S3_VERIFY]
                     [--s3_disable_ssl]
                     [--publish_to_big_query PUBLISH_TO_BIG_QUERY]
                     [--metrics_dataset METRICS_DATASET]
                     [--metrics_table METRICS_TABLE]
                     [--influx_measurement INFLUX_MEASUREMENT]
                     [--influx_db_name INFLUX_DB_NAME]
                     [--influx_hostname INFLUX_HOSTNAME]
                     [--input_options INPUT_OPTIONS] [--timeout_ms TIMEOUT_MS]
                     [--iterations ITERATIONS]
                     [--number_of_counter_operations NUMBER_OF_COUNTER_OPERATIONS]
                     [--number_of_counters=0 NUMBER_OF_COUNTERS=0]
pardo_test.py: error: argument --number_of_counters=0: expected one argument
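The "expected one argument" error above is what argparse reports when an option is registered with a default value baked into the option name itself: the literal string "--number_of_counters=0" becomes the flag, and passing that exact string on the command line then leaves no value for it. A minimal sketch reproducing the behavior (this is an illustrative guess at the bug, not Beam's actual pardo_test.py registration code):

```python
import argparse

# Buggy registration: the "=0" is part of the option name, so argparse
# treats the whole string "--number_of_counters=0" as a flag that still
# needs a separate value.
buggy = argparse.ArgumentParser(prog='pardo_test.py')
buggy.add_argument('--number_of_counters=0', type=int)

try:
    buggy.parse_args(['--number_of_counters=0'])
except SystemExit:
    # argparse exits with:
    # error: argument --number_of_counters=0: expected one argument
    pass

# The usual fix: keep the default out of the option name.
fixed = argparse.ArgumentParser(prog='pardo_test.py')
fixed.add_argument('--number_of_counters', type=int, default=0)
args = fixed.parse_args(['--number_of_counters=3'])
print(args.number_of_counters)  # 3
```

With the fixed registration, both `--number_of_counters=3` and `--number_of_counters 3` parse normally, and omitting the flag falls back to the default of 0.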

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 12s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/cxhlkooaeof5w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_LoadTests_Python_ParDo_Dataflow_Streaming #673

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/673/display/redirect?page=changes>


---------------------------------------------------------------------


Build failed in Jenkins: beam_LoadTests_Python_ParDo_Dataflow_Streaming #672

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/672/display/redirect?page=changes>

Changes:

[balazs.nemeth] BEAM-14525 Fix for Protobuf getter/setter method name discovery issue

[balazs.nemeth] BEAM-14525 Added a proto message with the problematic properties to use

[balazs.nemeth] PR CR: updating issue links

[noreply] added olehborysevych as collaborator (#22391)

[noreply] Add accept-language header for MPL license (#22395)

[noreply] Bump terser from 5.9.0 to 5.14.2 in

[noreply] Fixes #22156: Fix Spark3 runner to compile against Spark 3.2/3.3 and add


------------------------------------------
[...truncated 25.78 KB...]
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Requirement already satisfied: pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from httplib2<0.21.0,>=0.8->apache-beam==2.41.0.dev0) (3.0.9)
Collecting pbr>=0.11
  Using cached pbr-5.9.0-py2.py3-none-any.whl (112 kB)
Requirement already satisfied: py>=1.5.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (1.11.0)
Collecting atomicwrites>=1.0
  Using cached atomicwrites-1.4.1-py2.py3-none-any.whl
Collecting attrs>=17.4.0
  Using cached attrs-21.4.0-py2.py3-none-any.whl (60 kB)
Collecting wcwidth
  Using cached wcwidth-0.2.5-py2.py3-none-any.whl (30 kB)
Collecting more-itertools>=4.0.0
  Using cached more_itertools-8.13.0-py3-none-any.whl (51 kB)
Collecting pluggy<1.0,>=0.12
  Using cached pluggy-0.13.1-py2.py3-none-any.whl (18 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (2.1.3)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.6.15-py3-none-any.whl (160 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting charset-normalizer<3,>=2
  Using cached charset_normalizer-2.1.0-py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.10-py2.py3-none-any.whl (139 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker>=4.0.0
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.3-py3-none-any.whl (54 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.56.4-py2.py3-none-any.whl (211 kB)
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
INFO: pip is looking at multiple versions of google-api-core[grpc,grpcgcp] to determine which version is compatible with other requirements. This could take a while.
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.8.1-py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.8.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.3-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.2-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core[grpc,grpcgcp] to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
Requirement already satisfied: setuptools>=40.3.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-spanner<2,>=1.13.0->apache-beam==2.41.0.dev0) (63.2.0)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-auth<3,>=1.18.0
  Using cached google_auth-1.35.0-py2.py3-none-any.whl (152 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (3.8.1)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.41.0.dev0-py3-none-any.whl size=2827581 sha256=ecde9531900531843f0604be3aa5fe2c6e532ed8c28c7ac2cf907f565ca37bff
  Stored in directory: /home/jenkins/.cache/pip/wheels/4d/28/a8/75e525f0f56ebf0cac86293ab90763d6a2a3105a27bb3ba779
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, threadpoolctl, tenacity, rsa, pyyaml, python-dateutil, pymysql, pymongo, pyhamcrest, pydot, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, joblib, jmespath, isodate, idna, httplib2, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, certifi, cachetools, attrs, atomicwrites, sqlalchemy, scipy, requests, pyarrow, pluggy, pandas, overrides, oauth2client, mock, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, deprecation, cffi, botocore, scikit-learn, s3transfer, requests-oauthlib, requests_mock, pytest, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.41.0.dev0 atomicwrites-1.4.1 attrs-21.4.0 azure-core-1.24.2 azure-storage-blob-12.13.0 boto3-1.24.35 botocore-1.27.35 cachetools-4.2.4 certifi-2022.6.15 cffi-1.15.1 charset-normalizer-2.1.0 cloudpickle-2.1.0 crcmod-1.7 cryptography-37.0.4 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.5.3 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.32.0 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.4 google-cloud-bigquery-storage-2.13.2 google-cloud-bigtable-1.7.2 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.8.0 google-cloud-language-1.3.2 google-cloud-pubsub-2.13.4 google-cloud-pubsublite-1.4.2 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.3 google-cloud-videointelligence-1.16.3 google-cloud-vision-1.0.2 google-crc32c-1.3.0 google-resumable-media-2.3.3 googleapis-common-protos-1.56.4 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.47.0 hdfs-2.7.0 httplib2-0.20.4 idna-3.3 isodate-0.6.1 jmespath-1.0.1 joblib-1.1.0 mock-2.0.0 more-itertools-8.13.0 msrest-0.7.1 numpy-1.21.6 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.7.8 overrides-6.1.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.9.0 pluggy-0.13.1 proto-plus-1.20.6 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.28.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sqlalchemy-1.4.39 tenacity-5.1.5 testcontainers-3.6.0 threadpoolctl-3.1.0 typing-extensions-4.3.0 typing-utils-0.1.0 urllib3-1.26.10 wcwidth-0.2.5 websocket-client-1.3.3 wrapt-1.14.1

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.41.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0722125510.1658494896.071805/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0722125510.1658494896.071805/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0722125510.1658494896.071805/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0722125510.1658494896.071805/pipeline.pb in 0 seconds.
usage: pardo_test.py [-h] [--runner RUNNER] [--streaming]
                     [--resource_hint RESOURCE_HINTS]
                     [--beam_services BEAM_SERVICES]
                     [--type_check_strictness {ALL_REQUIRED,DEFAULT_TO_ANY}]
                     [--type_check_additional TYPE_CHECK_ADDITIONAL]
                     [--no_pipeline_type_check] [--runtime_type_check]
                     [--performance_runtime_type_check]
                     [--allow_non_deterministic_key_coders]
                     [--allow_unsafe_triggers]
                     [--no_direct_runner_use_stacked_bundle]
                     [--direct_runner_bundle_repeat DIRECT_RUNNER_BUNDLE_REPEAT]
                     [--direct_num_workers DIRECT_NUM_WORKERS]
                     [--direct_running_mode {in_memory,multi_threading,multi_processing}]
                     [--direct_embed_docker_python]
                     [--dataflow_endpoint DATAFLOW_ENDPOINT]
                     [--project PROJECT] [--job_name JOB_NAME]
                     [--staging_location STAGING_LOCATION]
                     [--temp_location TEMP_LOCATION] [--region REGION]
                     [--service_account_email SERVICE_ACCOUNT_EMAIL]
                     [--no_auth] [--template_location TEMPLATE_LOCATION]
                     [--label LABELS] [--update]
                     [--transform_name_mapping TRANSFORM_NAME_MAPPING]
                     [--enable_streaming_engine]
                     [--dataflow_kms_key DATAFLOW_KMS_KEY]
                     [--create_from_snapshot CREATE_FROM_SNAPSHOT]
                     [--flexrs_goal {COST_OPTIMIZED,SPEED_OPTIMIZED}]
                     [--dataflow_service_option DATAFLOW_SERVICE_OPTIONS]
                     [--enable_hot_key_logging] [--enable_artifact_caching]
                     [--impersonate_service_account IMPERSONATE_SERVICE_ACCOUNT]
                     [--hdfs_host HDFS_HOST] [--hdfs_port HDFS_PORT]
                     [--hdfs_user HDFS_USER] [--hdfs_full_urls]
                     [--num_workers NUM_WORKERS]
                     [--max_num_workers MAX_NUM_WORKERS]
                     [--autoscaling_algorithm {NONE,THROUGHPUT_BASED}]
                     [--worker_machine_type MACHINE_TYPE]
                     [--disk_size_gb DISK_SIZE_GB]
                     [--worker_disk_type DISK_TYPE]
                     [--worker_region WORKER_REGION]
                     [--worker_zone WORKER_ZONE] [--zone ZONE]
                     [--network NETWORK] [--subnetwork SUBNETWORK]
                     [--worker_harness_container_image WORKER_HARNESS_CONTAINER_IMAGE]
                     [--sdk_container_image SDK_CONTAINER_IMAGE]
                     [--sdk_harness_container_image_overrides SDK_HARNESS_CONTAINER_IMAGE_OVERRIDES]
                     [--default_sdk_harness_log_level DEFAULT_SDK_HARNESS_LOG_LEVEL]
                     [--sdk_harness_log_level_overrides SDK_HARNESS_LOG_LEVEL_OVERRIDES]
                     [--use_public_ips] [--no_use_public_ips]
                     [--min_cpu_platform MIN_CPU_PLATFORM]
                     [--dataflow_worker_jar DATAFLOW_WORKER_JAR]
                     [--dataflow_job_file DATAFLOW_JOB_FILE]
                     [--experiment EXPERIMENTS]
                     [--number_of_worker_harness_threads NUMBER_OF_WORKER_HARNESS_THREADS]
                     [--profile_cpu] [--profile_memory]
                     [--profile_location PROFILE_LOCATION]
                     [--profile_sample_rate PROFILE_SAMPLE_RATE]
                     [--requirements_file REQUIREMENTS_FILE]
                     [--requirements_cache REQUIREMENTS_CACHE]
                     [--requirements_cache_only_sources]
                     [--setup_file SETUP_FILE] [--beam_plugin BEAM_PLUGINS]
                     [--pickle_library {cloudpickle,default,dill}]
                     [--save_main_session] [--sdk_location SDK_LOCATION]
                     [--extra_package EXTRA_PACKAGES]
                     [--prebuild_sdk_container_engine PREBUILD_SDK_CONTAINER_ENGINE]
                     [--prebuild_sdk_container_base_image PREBUILD_SDK_CONTAINER_BASE_IMAGE]
                     [--cloud_build_machine_type CLOUD_BUILD_MACHINE_TYPE]
                     [--docker_registry_push_url DOCKER_REGISTRY_PUSH_URL]
                     [--job_endpoint JOB_ENDPOINT]
                     [--artifact_endpoint ARTIFACT_ENDPOINT]
                     [--job_server_timeout JOB_SERVER_TIMEOUT]
                     [--environment_type ENVIRONMENT_TYPE]
                     [--environment_config ENVIRONMENT_CONFIG]
                     [--environment_option ENVIRONMENT_OPTIONS]
                     [--sdk_worker_parallelism SDK_WORKER_PARALLELISM]
                     [--environment_cache_millis ENVIRONMENT_CACHE_MILLIS]
                     [--output_executable_path OUTPUT_EXECUTABLE_PATH]
                     [--artifacts_dir ARTIFACTS_DIR] [--job_port JOB_PORT]
                     [--artifact_port ARTIFACT_PORT]
                     [--expansion_port EXPANSION_PORT]
                     [--job_server_java_launcher JOB_SERVER_JAVA_LAUNCHER]
                     [--job_server_jvm_properties JOB_SERVER_JVM_PROPERTIES]
                     [--flink_master FLINK_MASTER]
                     [--flink_version {1.12,1.13,1.14,1.15}]
                     [--flink_job_server_jar FLINK_JOB_SERVER_JAR]
                     [--flink_submit_uber_jar]
                     [--spark_master_url SPARK_MASTER_URL]
                     [--spark_job_server_jar SPARK_JOB_SERVER_JAR]
                     [--spark_submit_uber_jar]
                     [--spark_rest_url SPARK_REST_URL] [--spark_version {2,3}]
                     [--on_success_matcher ON_SUCCESS_MATCHER]
                     [--dry_run DRY_RUN]
                     [--wait_until_finish_duration WAIT_UNTIL_FINISH_DURATION]
                     [--pubsub_root_url PUBSUBROOTURL]
                     [--s3_access_key_id S3_ACCESS_KEY_ID]
                     [--s3_secret_access_key S3_SECRET_ACCESS_KEY]
                     [--s3_session_token S3_SESSION_TOKEN]
                     [--s3_endpoint_url S3_ENDPOINT_URL]
                     [--s3_region_name S3_REGION_NAME]
                     [--s3_api_version S3_API_VERSION] [--s3_verify S3_VERIFY]
                     [--s3_disable_ssl]
                     [--publish_to_big_query PUBLISH_TO_BIG_QUERY]
                     [--metrics_dataset METRICS_DATASET]
                     [--metrics_table METRICS_TABLE]
                     [--influx_measurement INFLUX_MEASUREMENT]
                     [--influx_db_name INFLUX_DB_NAME]
                     [--influx_hostname INFLUX_HOSTNAME]
                     [--input_options INPUT_OPTIONS] [--timeout_ms TIMEOUT_MS]
                     [--iterations ITERATIONS]
                     [--number_of_counter_operations NUMBER_OF_COUNTER_OPERATIONS]
                     [--number_of_counters=0 NUMBER_OF_COUNTERS=0]
pardo_test.py: error: argument --number_of_counters=0: expected one argument
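[Editor's note] The usage listing above shows the real defect: argparse has registered the literal string `--number_of_counters=0` as an option name, i.e. the default value leaked into the flag itself. A minimal sketch reproducing the error (the parser below is a hypothetical reconstruction, not the actual pardo_test.py code):

```python
# Hedged sketch of the failure mode seen above; flag names mirror the log,
# but the parser itself is a reconstruction, not Beam's test code.
import argparse

# Buggy: "=0" is fused into the option name, so argparse registers the
# literal option string "--number_of_counters=0".
buggy = argparse.ArgumentParser()
buggy.add_argument('--number_of_counters=0', type=int)

failed = False
try:
    # argparse resolves exact option-string matches before splitting on "=",
    # so this token matches the misnamed option and argparse still waits for
    # a separate value: "argument --number_of_counters=0: expected one argument"
    buggy.parse_args(['--number_of_counters=0'])
except SystemExit:
    failed = True

# Fixed: keep the option name and its default separate.
fixed = argparse.ArgumentParser()
fixed.add_argument('--number_of_counters', type=int, default=0)
args = fixed.parse_args(['--number_of_counters=0'])
```

With the corrected definition, the same command-line token `--number_of_counters=0` is split on `=` and parsed as the value 0, so the job would get past argument parsing.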

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 11s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/snfj6go7y7ia2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_ParDo_Dataflow_Streaming #671

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/671/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Support combiner lifting.

[noreply] Bump google.golang.org/api from 0.87.0 to 0.88.0 in /sdks (#22350)

[Robert Bradshaw] More clarification.

[noreply] [CdapIO] HasOffset interface was implemented (#22193)


------------------------------------------
[...truncated 25.71 KB...]
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Requirement already satisfied: pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from httplib2<0.21.0,>=0.8->apache-beam==2.41.0.dev0) (3.0.9)
Collecting pbr>=0.11
  Using cached pbr-5.9.0-py2.py3-none-any.whl (112 kB)
Collecting wcwidth
  Using cached wcwidth-0.2.5-py2.py3-none-any.whl (30 kB)
Collecting more-itertools>=4.0.0
  Using cached more_itertools-8.13.0-py3-none-any.whl (51 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (2.1.3)
Collecting attrs>=17.4.0
  Using cached attrs-21.4.0-py2.py3-none-any.whl (60 kB)
Collecting atomicwrites>=1.0
  Using cached atomicwrites-1.4.1-py2.py3-none-any.whl
Collecting pluggy<1.0,>=0.12
  Using cached pluggy-0.13.1-py2.py3-none-any.whl (18 kB)
Requirement already satisfied: py>=1.5.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (1.11.0)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.6.15-py3-none-any.whl (160 kB)
Collecting charset-normalizer<3,>=2
  Using cached charset_normalizer-2.1.0-py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.10-py2.py3-none-any.whl (139 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker>=4.0.0
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.3-py3-none-any.whl (54 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.56.4-py2.py3-none-any.whl (211 kB)
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
INFO: pip is looking at multiple versions of google-api-core[grpc,grpcgcp] to determine which version is compatible with other requirements. This could take a while.
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.8.1-py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.8.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.3-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.2-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core[grpc,grpcgcp] to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
Requirement already satisfied: setuptools>=40.3.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-spanner<2,>=1.13.0->apache-beam==2.41.0.dev0) (63.2.0)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-auth<3,>=1.18.0
  Using cached google_auth-1.35.0-py2.py3-none-any.whl (152 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (3.8.1)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.41.0.dev0-py3-none-any.whl size=2827581 sha256=b57bbe7eddbb0100a1b8443901836415c2abf8c7ed33877c71da42b19627d09a
  Stored in directory: /home/jenkins/.cache/pip/wheels/4d/28/a8/75e525f0f56ebf0cac86293ab90763d6a2a3105a27bb3ba779
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, threadpoolctl, tenacity, rsa, pyyaml, python-dateutil, pymysql, pymongo, pyhamcrest, pydot, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, joblib, jmespath, isodate, idna, httplib2, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, certifi, cachetools, attrs, atomicwrites, sqlalchemy, scipy, requests, pyarrow, pluggy, pandas, overrides, oauth2client, mock, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, deprecation, cffi, botocore, scikit-learn, s3transfer, requests-oauthlib, requests_mock, pytest, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.41.0.dev0 atomicwrites-1.4.1 attrs-21.4.0 azure-core-1.24.2 azure-storage-blob-12.13.0 boto3-1.24.34 botocore-1.27.34 cachetools-4.2.4 certifi-2022.6.15 cffi-1.15.1 charset-normalizer-2.1.0 cloudpickle-2.1.0 crcmod-1.7 cryptography-37.0.4 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.5.3 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.32.0 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.4 google-cloud-bigquery-storage-2.13.2 google-cloud-bigtable-1.7.2 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.8.0 google-cloud-language-1.3.2 google-cloud-pubsub-2.13.4 google-cloud-pubsublite-1.4.2 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.3 google-cloud-videointelligence-1.16.3 google-cloud-vision-1.0.2 google-crc32c-1.3.0 google-resumable-media-2.3.3 googleapis-common-protos-1.56.4 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.47.0 hdfs-2.7.0 httplib2-0.20.4 idna-3.3 isodate-0.6.1 jmespath-1.0.1 joblib-1.1.0 mock-2.0.0 more-itertools-8.13.0 msrest-0.7.1 numpy-1.21.6 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.7.8 overrides-6.1.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.9.0 pluggy-0.13.1 proto-plus-1.20.6 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.28.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sqlalchemy-1.4.39 tenacity-5.1.5 testcontainers-3.6.0 threadpoolctl-3.1.0 typing-extensions-4.3.0 typing-utils-0.1.0 urllib3-1.26.10 wcwidth-0.2.5 websocket-client-1.3.3 wrapt-1.14.1

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.41.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0721125454.1658408494.353197/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0721125454.1658408494.353197/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0721125454.1658408494.353197/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0721125454.1658408494.353197/pipeline.pb in 0 seconds.
usage: pardo_test.py [-h] [--runner RUNNER] [--streaming]
                     [--resource_hint RESOURCE_HINTS]
                     [--beam_services BEAM_SERVICES]
                     [--type_check_strictness {ALL_REQUIRED,DEFAULT_TO_ANY}]
                     [--type_check_additional TYPE_CHECK_ADDITIONAL]
                     [--no_pipeline_type_check] [--runtime_type_check]
                     [--performance_runtime_type_check]
                     [--allow_non_deterministic_key_coders]
                     [--allow_unsafe_triggers]
                     [--no_direct_runner_use_stacked_bundle]
                     [--direct_runner_bundle_repeat DIRECT_RUNNER_BUNDLE_REPEAT]
                     [--direct_num_workers DIRECT_NUM_WORKERS]
                     [--direct_running_mode {in_memory,multi_threading,multi_processing}]
                     [--direct_embed_docker_python]
                     [--dataflow_endpoint DATAFLOW_ENDPOINT]
                     [--project PROJECT] [--job_name JOB_NAME]
                     [--staging_location STAGING_LOCATION]
                     [--temp_location TEMP_LOCATION] [--region REGION]
                     [--service_account_email SERVICE_ACCOUNT_EMAIL]
                     [--no_auth] [--template_location TEMPLATE_LOCATION]
                     [--label LABELS] [--update]
                     [--transform_name_mapping TRANSFORM_NAME_MAPPING]
                     [--enable_streaming_engine]
                     [--dataflow_kms_key DATAFLOW_KMS_KEY]
                     [--create_from_snapshot CREATE_FROM_SNAPSHOT]
                     [--flexrs_goal {COST_OPTIMIZED,SPEED_OPTIMIZED}]
                     [--dataflow_service_option DATAFLOW_SERVICE_OPTIONS]
                     [--enable_hot_key_logging] [--enable_artifact_caching]
                     [--impersonate_service_account IMPERSONATE_SERVICE_ACCOUNT]
                     [--hdfs_host HDFS_HOST] [--hdfs_port HDFS_PORT]
                     [--hdfs_user HDFS_USER] [--hdfs_full_urls]
                     [--num_workers NUM_WORKERS]
                     [--max_num_workers MAX_NUM_WORKERS]
                     [--autoscaling_algorithm {NONE,THROUGHPUT_BASED}]
                     [--worker_machine_type MACHINE_TYPE]
                     [--disk_size_gb DISK_SIZE_GB]
                     [--worker_disk_type DISK_TYPE]
                     [--worker_region WORKER_REGION]
                     [--worker_zone WORKER_ZONE] [--zone ZONE]
                     [--network NETWORK] [--subnetwork SUBNETWORK]
                     [--worker_harness_container_image WORKER_HARNESS_CONTAINER_IMAGE]
                     [--sdk_container_image SDK_CONTAINER_IMAGE]
                     [--sdk_harness_container_image_overrides SDK_HARNESS_CONTAINER_IMAGE_OVERRIDES]
                     [--default_sdk_harness_log_level DEFAULT_SDK_HARNESS_LOG_LEVEL]
                     [--sdk_harness_log_level_overrides SDK_HARNESS_LOG_LEVEL_OVERRIDES]
                     [--use_public_ips] [--no_use_public_ips]
                     [--min_cpu_platform MIN_CPU_PLATFORM]
                     [--dataflow_worker_jar DATAFLOW_WORKER_JAR]
                     [--dataflow_job_file DATAFLOW_JOB_FILE]
                     [--experiment EXPERIMENTS]
                     [--number_of_worker_harness_threads NUMBER_OF_WORKER_HARNESS_THREADS]
                     [--profile_cpu] [--profile_memory]
                     [--profile_location PROFILE_LOCATION]
                     [--profile_sample_rate PROFILE_SAMPLE_RATE]
                     [--requirements_file REQUIREMENTS_FILE]
                     [--requirements_cache REQUIREMENTS_CACHE]
                     [--requirements_cache_only_sources]
                     [--setup_file SETUP_FILE] [--beam_plugin BEAM_PLUGINS]
                     [--pickle_library {cloudpickle,default,dill}]
                     [--save_main_session] [--sdk_location SDK_LOCATION]
                     [--extra_package EXTRA_PACKAGES]
                     [--prebuild_sdk_container_engine PREBUILD_SDK_CONTAINER_ENGINE]
                     [--prebuild_sdk_container_base_image PREBUILD_SDK_CONTAINER_BASE_IMAGE]
                     [--cloud_build_machine_type CLOUD_BUILD_MACHINE_TYPE]
                     [--docker_registry_push_url DOCKER_REGISTRY_PUSH_URL]
                     [--job_endpoint JOB_ENDPOINT]
                     [--artifact_endpoint ARTIFACT_ENDPOINT]
                     [--job_server_timeout JOB_SERVER_TIMEOUT]
                     [--environment_type ENVIRONMENT_TYPE]
                     [--environment_config ENVIRONMENT_CONFIG]
                     [--environment_option ENVIRONMENT_OPTIONS]
                     [--sdk_worker_parallelism SDK_WORKER_PARALLELISM]
                     [--environment_cache_millis ENVIRONMENT_CACHE_MILLIS]
                     [--output_executable_path OUTPUT_EXECUTABLE_PATH]
                     [--artifacts_dir ARTIFACTS_DIR] [--job_port JOB_PORT]
                     [--artifact_port ARTIFACT_PORT]
                     [--expansion_port EXPANSION_PORT]
                     [--job_server_java_launcher JOB_SERVER_JAVA_LAUNCHER]
                     [--job_server_jvm_properties JOB_SERVER_JVM_PROPERTIES]
                     [--flink_master FLINK_MASTER]
                     [--flink_version {1.12,1.13,1.14,1.15}]
                     [--flink_job_server_jar FLINK_JOB_SERVER_JAR]
                     [--flink_submit_uber_jar]
                     [--spark_master_url SPARK_MASTER_URL]
                     [--spark_job_server_jar SPARK_JOB_SERVER_JAR]
                     [--spark_submit_uber_jar]
                     [--spark_rest_url SPARK_REST_URL] [--spark_version {2,3}]
                     [--on_success_matcher ON_SUCCESS_MATCHER]
                     [--dry_run DRY_RUN]
                     [--wait_until_finish_duration WAIT_UNTIL_FINISH_DURATION]
                     [--pubsub_root_url PUBSUBROOTURL]
                     [--s3_access_key_id S3_ACCESS_KEY_ID]
                     [--s3_secret_access_key S3_SECRET_ACCESS_KEY]
                     [--s3_session_token S3_SESSION_TOKEN]
                     [--s3_endpoint_url S3_ENDPOINT_URL]
                     [--s3_region_name S3_REGION_NAME]
                     [--s3_api_version S3_API_VERSION] [--s3_verify S3_VERIFY]
                     [--s3_disable_ssl]
                     [--publish_to_big_query PUBLISH_TO_BIG_QUERY]
                     [--metrics_dataset METRICS_DATASET]
                     [--metrics_table METRICS_TABLE]
                     [--influx_measurement INFLUX_MEASUREMENT]
                     [--influx_db_name INFLUX_DB_NAME]
                     [--influx_hostname INFLUX_HOSTNAME]
                     [--input_options INPUT_OPTIONS] [--timeout_ms TIMEOUT_MS]
                     [--iterations ITERATIONS]
                     [--number_of_counter_operations NUMBER_OF_COUNTER_OPERATIONS]
                     [--number_of_counters=0 NUMBER_OF_COUNTERS=0]
pardo_test.py: error: argument --number_of_counters=0: expected one argument

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 10s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/x7wmownpvsips

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_ParDo_Dataflow_Streaming #670

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/670/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Require unique names for stages.

[noreply] Add links to the new RunInference content to Learning Resources (#22325)

[noreply] Unskip RunInference IT tests (#22324)

[noreply] cleaned up types in standard_coders.ts (#22316)

[noreply] JMH module for sdks:java:core with benchmarks for

[noreply] Bump cloud.google.com/go/pubsub from 1.23.1 to 1.24.0 in /sdks (#22332)

[Luke Cwik] [#22181] Fix java package for SDK java core benchmark

[Luke Cwik] Allow jmhTest to run concurrently with other jmhTest instances

[noreply] [BEAM-13015, #21250] Optimize encoding to a ByteString (#22345)


------------------------------------------
[...truncated 25.77 KB...]
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Requirement already satisfied: pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from httplib2<0.21.0,>=0.8->apache-beam==2.41.0.dev0) (3.0.9)
Collecting pbr>=0.11
  Using cached pbr-5.9.0-py2.py3-none-any.whl (112 kB)
Collecting atomicwrites>=1.0
  Using cached atomicwrites-1.4.1-py2.py3-none-any.whl
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (2.1.3)
Collecting attrs>=17.4.0
  Using cached attrs-21.4.0-py2.py3-none-any.whl (60 kB)
Requirement already satisfied: py>=1.5.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (1.11.0)
Collecting wcwidth
  Using cached wcwidth-0.2.5-py2.py3-none-any.whl (30 kB)
Collecting pluggy<1.0,>=0.12
  Using cached pluggy-0.13.1-py2.py3-none-any.whl (18 kB)
Collecting more-itertools>=4.0.0
  Using cached more_itertools-8.13.0-py3-none-any.whl (51 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.10-py2.py3-none-any.whl (139 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.6.15-py3-none-any.whl (160 kB)
Collecting charset-normalizer<3,>=2
  Using cached charset_normalizer-2.1.0-py3-none-any.whl (39 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker>=4.0.0
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.3-py3-none-any.whl (54 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.56.4-py2.py3-none-any.whl (211 kB)
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
INFO: pip is looking at multiple versions of google-api-core[grpc,grpcgcp] to determine which version is compatible with other requirements. This could take a while.
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.8.1-py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.8.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.3-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.2-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core[grpc,grpcgcp] to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
Requirement already satisfied: setuptools>=40.3.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-spanner<2,>=1.13.0->apache-beam==2.41.0.dev0) (63.2.0)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-auth<3,>=1.18.0
  Using cached google_auth-1.35.0-py2.py3-none-any.whl (152 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (3.8.1)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.41.0.dev0-py3-none-any.whl size=2827581 sha256=8153d2e4e776d6d31e515314c33ceed3e57713d01d99fe55814008eaf3c33815
  Stored in directory: /home/jenkins/.cache/pip/wheels/4d/28/a8/75e525f0f56ebf0cac86293ab90763d6a2a3105a27bb3ba779
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, threadpoolctl, tenacity, rsa, pyyaml, python-dateutil, pymysql, pymongo, pyhamcrest, pydot, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, joblib, jmespath, isodate, idna, httplib2, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, certifi, cachetools, attrs, atomicwrites, sqlalchemy, scipy, requests, pyarrow, pluggy, pandas, overrides, oauth2client, mock, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, deprecation, cffi, botocore, scikit-learn, s3transfer, requests-oauthlib, requests_mock, pytest, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.41.0.dev0 atomicwrites-1.4.1 attrs-21.4.0 azure-core-1.24.2 azure-storage-blob-12.13.0 boto3-1.24.33 botocore-1.27.33 cachetools-4.2.4 certifi-2022.6.15 cffi-1.15.1 charset-normalizer-2.1.0 cloudpickle-2.1.0 crcmod-1.7 cryptography-37.0.4 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.5.3 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.32.0 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.4 google-cloud-bigquery-storage-2.13.2 google-cloud-bigtable-1.7.2 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.8.0 google-cloud-language-1.3.2 google-cloud-pubsub-2.13.4 google-cloud-pubsublite-1.4.2 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.3 google-cloud-videointelligence-1.16.3 google-cloud-vision-1.0.2 google-crc32c-1.3.0 google-resumable-media-2.3.3 googleapis-common-protos-1.56.4 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.47.0 hdfs-2.7.0 httplib2-0.20.4 idna-3.3 isodate-0.6.1 jmespath-1.0.1 joblib-1.1.0 mock-2.0.0 more-itertools-8.13.0 msrest-0.7.1 numpy-1.21.6 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.7.8 overrides-6.1.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.9.0 pluggy-0.13.1 proto-plus-1.20.6 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.28.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sqlalchemy-1.4.39 tenacity-5.1.5 testcontainers-3.6.0 threadpoolctl-3.1.0 typing-extensions-4.3.0 typing-utils-0.1.0 urllib3-1.26.10 wcwidth-0.2.5 websocket-client-1.3.3 wrapt-1.14.1

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.41.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0720125506.1658322199.025465/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0720125506.1658322199.025465/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0720125506.1658322199.025465/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0720125506.1658322199.025465/pipeline.pb in 0 seconds.
usage: pardo_test.py [-h] [--runner RUNNER] [--streaming]
                     [--resource_hint RESOURCE_HINTS]
                     [--beam_services BEAM_SERVICES]
                     [--type_check_strictness {ALL_REQUIRED,DEFAULT_TO_ANY}]
                     [--type_check_additional TYPE_CHECK_ADDITIONAL]
                     [--no_pipeline_type_check] [--runtime_type_check]
                     [--performance_runtime_type_check]
                     [--allow_non_deterministic_key_coders]
                     [--allow_unsafe_triggers]
                     [--no_direct_runner_use_stacked_bundle]
                     [--direct_runner_bundle_repeat DIRECT_RUNNER_BUNDLE_REPEAT]
                     [--direct_num_workers DIRECT_NUM_WORKERS]
                     [--direct_running_mode {in_memory,multi_threading,multi_processing}]
                     [--direct_embed_docker_python]
                     [--dataflow_endpoint DATAFLOW_ENDPOINT]
                     [--project PROJECT] [--job_name JOB_NAME]
                     [--staging_location STAGING_LOCATION]
                     [--temp_location TEMP_LOCATION] [--region REGION]
                     [--service_account_email SERVICE_ACCOUNT_EMAIL]
                     [--no_auth] [--template_location TEMPLATE_LOCATION]
                     [--label LABELS] [--update]
                     [--transform_name_mapping TRANSFORM_NAME_MAPPING]
                     [--enable_streaming_engine]
                     [--dataflow_kms_key DATAFLOW_KMS_KEY]
                     [--create_from_snapshot CREATE_FROM_SNAPSHOT]
                     [--flexrs_goal {COST_OPTIMIZED,SPEED_OPTIMIZED}]
                     [--dataflow_service_option DATAFLOW_SERVICE_OPTIONS]
                     [--enable_hot_key_logging] [--enable_artifact_caching]
                     [--impersonate_service_account IMPERSONATE_SERVICE_ACCOUNT]
                     [--hdfs_host HDFS_HOST] [--hdfs_port HDFS_PORT]
                     [--hdfs_user HDFS_USER] [--hdfs_full_urls]
                     [--num_workers NUM_WORKERS]
                     [--max_num_workers MAX_NUM_WORKERS]
                     [--autoscaling_algorithm {NONE,THROUGHPUT_BASED}]
                     [--worker_machine_type MACHINE_TYPE]
                     [--disk_size_gb DISK_SIZE_GB]
                     [--worker_disk_type DISK_TYPE]
                     [--worker_region WORKER_REGION]
                     [--worker_zone WORKER_ZONE] [--zone ZONE]
                     [--network NETWORK] [--subnetwork SUBNETWORK]
                     [--worker_harness_container_image WORKER_HARNESS_CONTAINER_IMAGE]
                     [--sdk_container_image SDK_CONTAINER_IMAGE]
                     [--sdk_harness_container_image_overrides SDK_HARNESS_CONTAINER_IMAGE_OVERRIDES]
                     [--default_sdk_harness_log_level DEFAULT_SDK_HARNESS_LOG_LEVEL]
                     [--sdk_harness_log_level_overrides SDK_HARNESS_LOG_LEVEL_OVERRIDES]
                     [--use_public_ips] [--no_use_public_ips]
                     [--min_cpu_platform MIN_CPU_PLATFORM]
                     [--dataflow_worker_jar DATAFLOW_WORKER_JAR]
                     [--dataflow_job_file DATAFLOW_JOB_FILE]
                     [--experiment EXPERIMENTS]
                     [--number_of_worker_harness_threads NUMBER_OF_WORKER_HARNESS_THREADS]
                     [--profile_cpu] [--profile_memory]
                     [--profile_location PROFILE_LOCATION]
                     [--profile_sample_rate PROFILE_SAMPLE_RATE]
                     [--requirements_file REQUIREMENTS_FILE]
                     [--requirements_cache REQUIREMENTS_CACHE]
                     [--requirements_cache_only_sources]
                     [--setup_file SETUP_FILE] [--beam_plugin BEAM_PLUGINS]
                     [--pickle_library {cloudpickle,default,dill}]
                     [--save_main_session] [--sdk_location SDK_LOCATION]
                     [--extra_package EXTRA_PACKAGES]
                     [--prebuild_sdk_container_engine PREBUILD_SDK_CONTAINER_ENGINE]
                     [--prebuild_sdk_container_base_image PREBUILD_SDK_CONTAINER_BASE_IMAGE]
                     [--cloud_build_machine_type CLOUD_BUILD_MACHINE_TYPE]
                     [--docker_registry_push_url DOCKER_REGISTRY_PUSH_URL]
                     [--job_endpoint JOB_ENDPOINT]
                     [--artifact_endpoint ARTIFACT_ENDPOINT]
                     [--job_server_timeout JOB_SERVER_TIMEOUT]
                     [--environment_type ENVIRONMENT_TYPE]
                     [--environment_config ENVIRONMENT_CONFIG]
                     [--environment_option ENVIRONMENT_OPTIONS]
                     [--sdk_worker_parallelism SDK_WORKER_PARALLELISM]
                     [--environment_cache_millis ENVIRONMENT_CACHE_MILLIS]
                     [--output_executable_path OUTPUT_EXECUTABLE_PATH]
                     [--artifacts_dir ARTIFACTS_DIR] [--job_port JOB_PORT]
                     [--artifact_port ARTIFACT_PORT]
                     [--expansion_port EXPANSION_PORT]
                     [--job_server_java_launcher JOB_SERVER_JAVA_LAUNCHER]
                     [--job_server_jvm_properties JOB_SERVER_JVM_PROPERTIES]
                     [--flink_master FLINK_MASTER]
                     [--flink_version {1.12,1.13,1.14,1.15}]
                     [--flink_job_server_jar FLINK_JOB_SERVER_JAR]
                     [--flink_submit_uber_jar]
                     [--spark_master_url SPARK_MASTER_URL]
                     [--spark_job_server_jar SPARK_JOB_SERVER_JAR]
                     [--spark_submit_uber_jar]
                     [--spark_rest_url SPARK_REST_URL] [--spark_version {2,3}]
                     [--on_success_matcher ON_SUCCESS_MATCHER]
                     [--dry_run DRY_RUN]
                     [--wait_until_finish_duration WAIT_UNTIL_FINISH_DURATION]
                     [--pubsub_root_url PUBSUBROOTURL]
                     [--s3_access_key_id S3_ACCESS_KEY_ID]
                     [--s3_secret_access_key S3_SECRET_ACCESS_KEY]
                     [--s3_session_token S3_SESSION_TOKEN]
                     [--s3_endpoint_url S3_ENDPOINT_URL]
                     [--s3_region_name S3_REGION_NAME]
                     [--s3_api_version S3_API_VERSION] [--s3_verify S3_VERIFY]
                     [--s3_disable_ssl]
                     [--publish_to_big_query PUBLISH_TO_BIG_QUERY]
                     [--metrics_dataset METRICS_DATASET]
                     [--metrics_table METRICS_TABLE]
                     [--influx_measurement INFLUX_MEASUREMENT]
                     [--influx_db_name INFLUX_DB_NAME]
                     [--influx_hostname INFLUX_HOSTNAME]
                     [--input_options INPUT_OPTIONS] [--timeout_ms TIMEOUT_MS]
                     [--iterations ITERATIONS]
                     [--number_of_counter_operations NUMBER_OF_COUNTER_OPERATIONS]
                     [--number_of_counters=0 NUMBER_OF_COUNTERS=0]
pardo_test.py: error: argument --number_of_counters=0: expected one argument
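The argparse error above is consistent with the option having been registered with "=0" baked into the flag name itself rather than supplied as a default value. A minimal, hypothetical sketch (not the actual pardo_test.py source) that reproduces the same message, alongside the likely intended registration:

```python
import argparse

# Buggy registration: the "=0" becomes part of the option string, so
# argparse advertises a flag literally named "--number_of_counters=0"
# and then demands a value for it ("expected one argument") when the
# flag appears alone on the command line.
buggy = argparse.ArgumentParser()
buggy.add_argument('--number_of_counters=0', type=int)
try:
    buggy.parse_args(['--number_of_counters=0'])
except SystemExit:
    # argparse exits after printing:
    #   error: argument --number_of_counters=0: expected one argument
    pass

# Likely intended registration: a flag named --number_of_counters
# whose default value is 0.
fixed = argparse.ArgumentParser()
fixed.add_argument('--number_of_counters', type=int, default=0)
args = fixed.parse_args(['--number_of_counters', '3'])
print(args.number_of_counters)  # 3
```

If this reading is right, the fix belongs in the load-test pipeline configuration that builds the argument list, not in the test harness itself.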

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 31s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ofu56x3ni24ku

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_ParDo_Dataflow_Streaming #669

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/669/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14117] Unvendor bytebuddy dependency (#17317)

[noreply] Use npm ci instead of install in CI (#22323)

[noreply] Fix typo in use_single_core_per_container logic. (#22318)

[noreply] [#22319] Regenerate proto2_coder_test_messages_pb2.py manually (#22320)


------------------------------------------
[...truncated 25.75 KB...]
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Requirement already satisfied: pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from httplib2<0.21.0,>=0.8->apache-beam==2.41.0.dev0) (3.0.9)
Collecting pbr>=0.11
  Using cached pbr-5.9.0-py2.py3-none-any.whl (112 kB)
Requirement already satisfied: py>=1.5.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (1.11.0)
Collecting more-itertools>=4.0.0
  Using cached more_itertools-8.13.0-py3-none-any.whl (51 kB)
Collecting pluggy<1.0,>=0.12
  Using cached pluggy-0.13.1-py2.py3-none-any.whl (18 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (2.1.3)
Collecting attrs>=17.4.0
  Using cached attrs-21.4.0-py2.py3-none-any.whl (60 kB)
Collecting atomicwrites>=1.0
  Using cached atomicwrites-1.4.1-py2.py3-none-any.whl
Collecting wcwidth
  Using cached wcwidth-0.2.5-py2.py3-none-any.whl (30 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.10-py2.py3-none-any.whl (139 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.6.15-py3-none-any.whl (160 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting charset-normalizer<3,>=2
  Using cached charset_normalizer-2.1.0-py3-none-any.whl (39 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker>=4.0.0
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.3-py3-none-any.whl (54 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.56.4-py2.py3-none-any.whl (211 kB)
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
INFO: pip is looking at multiple versions of google-api-core[grpc,grpcgcp] to determine which version is compatible with other requirements. This could take a while.
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.8.1-py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.8.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.3-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.2-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core[grpc,grpcgcp] to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
Requirement already satisfied: setuptools>=40.3.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-spanner<2,>=1.13.0->apache-beam==2.41.0.dev0) (63.2.0)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-auth<3,>=1.18.0
  Using cached google_auth-1.35.0-py2.py3-none-any.whl (152 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (3.8.1)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.41.0.dev0-py3-none-any.whl size=2827505 sha256=6b826949031ee16ef4b8340c9fd362b10eb373823690c0867568ae084a636190
  Stored in directory: /home/jenkins/.cache/pip/wheels/4d/28/a8/75e525f0f56ebf0cac86293ab90763d6a2a3105a27bb3ba779
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, threadpoolctl, tenacity, rsa, pyyaml, python-dateutil, pymysql, pymongo, pyhamcrest, pydot, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, joblib, jmespath, isodate, idna, httplib2, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, certifi, cachetools, attrs, atomicwrites, sqlalchemy, scipy, requests, pyarrow, pluggy, pandas, overrides, oauth2client, mock, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, deprecation, cffi, botocore, scikit-learn, s3transfer, requests-oauthlib, requests_mock, pytest, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.41.0.dev0 atomicwrites-1.4.1 attrs-21.4.0 azure-core-1.24.2 azure-storage-blob-12.13.0 boto3-1.24.32 botocore-1.27.32 cachetools-4.2.4 certifi-2022.6.15 cffi-1.15.1 charset-normalizer-2.1.0 cloudpickle-2.1.0 crcmod-1.7 cryptography-37.0.4 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.5.2 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.32.0 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.4 google-cloud-bigquery-storage-2.13.2 google-cloud-bigtable-1.7.2 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.8.0 google-cloud-language-1.3.2 google-cloud-pubsub-2.13.4 google-cloud-pubsublite-1.4.2 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.3 google-cloud-videointelligence-1.16.3 google-cloud-vision-1.0.2 google-crc32c-1.3.0 google-resumable-media-2.3.3 googleapis-common-protos-1.56.4 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.47.0 hdfs-2.7.0 httplib2-0.20.4 idna-3.3 isodate-0.6.1 jmespath-1.0.1 joblib-1.1.0 mock-2.0.0 more-itertools-8.13.0 msrest-0.7.1 numpy-1.21.6 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.7.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.9.0 pluggy-0.13.1 proto-plus-1.20.6 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.28.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sqlalchemy-1.4.39 tenacity-5.1.5 testcontainers-3.6.0 threadpoolctl-3.1.0 typing-extensions-4.3.0 typing-utils-0.1.0 urllib3-1.26.10 wcwidth-0.2.5 websocket-client-1.3.3 wrapt-1.14.1

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.41.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0719125454.1658235705.222116/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0719125454.1658235705.222116/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0719125454.1658235705.222116/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0719125454.1658235705.222116/pipeline.pb in 0 seconds.
usage: pardo_test.py [-h] [--runner RUNNER] [--streaming]
                     [--resource_hint RESOURCE_HINTS]
                     [--beam_services BEAM_SERVICES]
                     [--type_check_strictness {ALL_REQUIRED,DEFAULT_TO_ANY}]
                     [--type_check_additional TYPE_CHECK_ADDITIONAL]
                     [--no_pipeline_type_check] [--runtime_type_check]
                     [--performance_runtime_type_check]
                     [--allow_non_deterministic_key_coders]
                     [--allow_unsafe_triggers]
                     [--no_direct_runner_use_stacked_bundle]
                     [--direct_runner_bundle_repeat DIRECT_RUNNER_BUNDLE_REPEAT]
                     [--direct_num_workers DIRECT_NUM_WORKERS]
                     [--direct_running_mode {in_memory,multi_threading,multi_processing}]
                     [--direct_embed_docker_python]
                     [--dataflow_endpoint DATAFLOW_ENDPOINT]
                     [--project PROJECT] [--job_name JOB_NAME]
                     [--staging_location STAGING_LOCATION]
                     [--temp_location TEMP_LOCATION] [--region REGION]
                     [--service_account_email SERVICE_ACCOUNT_EMAIL]
                     [--no_auth] [--template_location TEMPLATE_LOCATION]
                     [--label LABELS] [--update]
                     [--transform_name_mapping TRANSFORM_NAME_MAPPING]
                     [--enable_streaming_engine]
                     [--dataflow_kms_key DATAFLOW_KMS_KEY]
                     [--create_from_snapshot CREATE_FROM_SNAPSHOT]
                     [--flexrs_goal {COST_OPTIMIZED,SPEED_OPTIMIZED}]
                     [--dataflow_service_option DATAFLOW_SERVICE_OPTIONS]
                     [--enable_hot_key_logging] [--enable_artifact_caching]
                     [--impersonate_service_account IMPERSONATE_SERVICE_ACCOUNT]
                     [--hdfs_host HDFS_HOST] [--hdfs_port HDFS_PORT]
                     [--hdfs_user HDFS_USER] [--hdfs_full_urls]
                     [--num_workers NUM_WORKERS]
                     [--max_num_workers MAX_NUM_WORKERS]
                     [--autoscaling_algorithm {NONE,THROUGHPUT_BASED}]
                     [--worker_machine_type MACHINE_TYPE]
                     [--disk_size_gb DISK_SIZE_GB]
                     [--worker_disk_type DISK_TYPE]
                     [--worker_region WORKER_REGION]
                     [--worker_zone WORKER_ZONE] [--zone ZONE]
                     [--network NETWORK] [--subnetwork SUBNETWORK]
                     [--worker_harness_container_image WORKER_HARNESS_CONTAINER_IMAGE]
                     [--sdk_container_image SDK_CONTAINER_IMAGE]
                     [--sdk_harness_container_image_overrides SDK_HARNESS_CONTAINER_IMAGE_OVERRIDES]
                     [--default_sdk_harness_log_level DEFAULT_SDK_HARNESS_LOG_LEVEL]
                     [--sdk_harness_log_level_overrides SDK_HARNESS_LOG_LEVEL_OVERRIDES]
                     [--use_public_ips] [--no_use_public_ips]
                     [--min_cpu_platform MIN_CPU_PLATFORM]
                     [--dataflow_worker_jar DATAFLOW_WORKER_JAR]
                     [--dataflow_job_file DATAFLOW_JOB_FILE]
                     [--experiment EXPERIMENTS]
                     [--number_of_worker_harness_threads NUMBER_OF_WORKER_HARNESS_THREADS]
                     [--profile_cpu] [--profile_memory]
                     [--profile_location PROFILE_LOCATION]
                     [--profile_sample_rate PROFILE_SAMPLE_RATE]
                     [--requirements_file REQUIREMENTS_FILE]
                     [--requirements_cache REQUIREMENTS_CACHE]
                     [--requirements_cache_only_sources]
                     [--setup_file SETUP_FILE] [--beam_plugin BEAM_PLUGINS]
                     [--pickle_library {cloudpickle,default,dill}]
                     [--save_main_session] [--sdk_location SDK_LOCATION]
                     [--extra_package EXTRA_PACKAGES]
                     [--prebuild_sdk_container_engine PREBUILD_SDK_CONTAINER_ENGINE]
                     [--prebuild_sdk_container_base_image PREBUILD_SDK_CONTAINER_BASE_IMAGE]
                     [--cloud_build_machine_type CLOUD_BUILD_MACHINE_TYPE]
                     [--docker_registry_push_url DOCKER_REGISTRY_PUSH_URL]
                     [--job_endpoint JOB_ENDPOINT]
                     [--artifact_endpoint ARTIFACT_ENDPOINT]
                     [--job_server_timeout JOB_SERVER_TIMEOUT]
                     [--environment_type ENVIRONMENT_TYPE]
                     [--environment_config ENVIRONMENT_CONFIG]
                     [--environment_option ENVIRONMENT_OPTIONS]
                     [--sdk_worker_parallelism SDK_WORKER_PARALLELISM]
                     [--environment_cache_millis ENVIRONMENT_CACHE_MILLIS]
                     [--output_executable_path OUTPUT_EXECUTABLE_PATH]
                     [--artifacts_dir ARTIFACTS_DIR] [--job_port JOB_PORT]
                     [--artifact_port ARTIFACT_PORT]
                     [--expansion_port EXPANSION_PORT]
                     [--job_server_java_launcher JOB_SERVER_JAVA_LAUNCHER]
                     [--job_server_jvm_properties JOB_SERVER_JVM_PROPERTIES]
                     [--flink_master FLINK_MASTER]
                     [--flink_version {1.12,1.13,1.14,1.15}]
                     [--flink_job_server_jar FLINK_JOB_SERVER_JAR]
                     [--flink_submit_uber_jar]
                     [--spark_master_url SPARK_MASTER_URL]
                     [--spark_job_server_jar SPARK_JOB_SERVER_JAR]
                     [--spark_submit_uber_jar]
                     [--spark_rest_url SPARK_REST_URL] [--spark_version {2,3}]
                     [--on_success_matcher ON_SUCCESS_MATCHER]
                     [--dry_run DRY_RUN]
                     [--wait_until_finish_duration WAIT_UNTIL_FINISH_DURATION]
                     [--pubsub_root_url PUBSUBROOTURL]
                     [--s3_access_key_id S3_ACCESS_KEY_ID]
                     [--s3_secret_access_key S3_SECRET_ACCESS_KEY]
                     [--s3_session_token S3_SESSION_TOKEN]
                     [--s3_endpoint_url S3_ENDPOINT_URL]
                     [--s3_region_name S3_REGION_NAME]
                     [--s3_api_version S3_API_VERSION] [--s3_verify S3_VERIFY]
                     [--s3_disable_ssl]
                     [--publish_to_big_query PUBLISH_TO_BIG_QUERY]
                     [--metrics_dataset METRICS_DATASET]
                     [--metrics_table METRICS_TABLE]
                     [--influx_measurement INFLUX_MEASUREMENT]
                     [--influx_db_name INFLUX_DB_NAME]
                     [--influx_hostname INFLUX_HOSTNAME]
                     [--input_options INPUT_OPTIONS] [--timeout_ms TIMEOUT_MS]
                     [--iterations ITERATIONS]
                     [--number_of_counter_operations NUMBER_OF_COUNTER_OPERATIONS]
                     [--number_of_counters=0 NUMBER_OF_COUNTERS=0]
pardo_test.py: error: argument --number_of_counters=0: expected one argument
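The argparse failure above can be reproduced in isolation. A minimal sketch, not the actual pardo_test.py code: only the option name "--number_of_counters=0" is taken verbatim from the usage listing, and the surrounding parser setup is hypothetical. The key detail is that argparse matches the whole command-line token against registered option strings exactly before splitting on "=", so a flag whose registered name already contains "=0" is seen as a bare option with no value attached:

```python
import argparse

# Buggy registration: the default value leaked into the option name, so the
# registered flag is literally "--number_of_counters=0". The command-line
# token "--number_of_counters=0" then matches this option string exactly
# (before argparse splits on "="), leaving no value attached, and a plain
# nargs=None option demands exactly one value.
buggy = argparse.ArgumentParser(prog='pardo_test.py')
buggy.add_argument('--number_of_counters=0', type=int)
try:
    buggy.parse_args(['--number_of_counters=0'])
except SystemExit:
    # argparse prints "error: argument --number_of_counters=0: expected one
    # argument" to stderr and exits, matching the failure in the log.
    pass

# Likely intended registration: the default belongs in the `default=` keyword.
fixed = argparse.ArgumentParser(prog='pardo_test.py')
fixed.add_argument('--number_of_counters', type=int, default=0)
args = fixed.parse_args(['--number_of_counters=0'])
print(args.number_of_counters)  # 0
```

With the corrected registration, "--number_of_counters=0" no longer matches any option string exactly, so argparse falls back to splitting at "=" and parses it as the option "--number_of_counters" with value "0".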

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 12s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/gbagkfmqhb5fe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_ParDo_Dataflow_Streaming #668

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/668/display/redirect?page=changes>

Changes:

[Alexey Romanenko] [website] Add TPC-DS benchmark documentation

[noreply] Increase streaming server timeout  (#22280)


------------------------------------------
[...truncated 25.73 KB...]
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Requirement already satisfied: pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from httplib2<0.21.0,>=0.8->apache-beam==2.41.0.dev0) (3.0.9)
Collecting pbr>=0.11
  Using cached pbr-5.9.0-py2.py3-none-any.whl (112 kB)
Requirement already satisfied: py>=1.5.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (1.11.0)
Collecting wcwidth
  Using cached wcwidth-0.2.5-py2.py3-none-any.whl (30 kB)
Collecting attrs>=17.4.0
  Using cached attrs-21.4.0-py2.py3-none-any.whl (60 kB)
Collecting more-itertools>=4.0.0
  Using cached more_itertools-8.13.0-py3-none-any.whl (51 kB)
Collecting pluggy<1.0,>=0.12
  Using cached pluggy-0.13.1-py2.py3-none-any.whl (18 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (2.1.3)
Collecting atomicwrites>=1.0
  Using cached atomicwrites-1.4.1-py2.py3-none-any.whl
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.10-py2.py3-none-any.whl (139 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.6.15-py3-none-any.whl (160 kB)
Collecting charset-normalizer<3,>=2
  Using cached charset_normalizer-2.1.0-py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker>=4.0.0
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.3-py3-none-any.whl (54 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.56.4-py2.py3-none-any.whl (211 kB)
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
INFO: pip is looking at multiple versions of google-api-core[grpc,grpcgcp] to determine which version is compatible with other requirements. This could take a while.
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.8.1-py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.8.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.3-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.2-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core[grpc,grpcgcp] to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
Requirement already satisfied: setuptools>=40.3.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-spanner<2,>=1.13.0->apache-beam==2.41.0.dev0) (63.2.0)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-auth<3,>=1.18.0
  Using cached google_auth-1.35.0-py2.py3-none-any.whl (152 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (3.8.1)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.41.0.dev0-py3-none-any.whl size=2827423 sha256=2c4590f5aa0ca6e2512b3fa2bedf7973aa29d6d3b94f7890b6045ee816f3f476
  Stored in directory: /home/jenkins/.cache/pip/wheels/4d/28/a8/75e525f0f56ebf0cac86293ab90763d6a2a3105a27bb3ba779
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, threadpoolctl, tenacity, rsa, pyyaml, python-dateutil, pymysql, pymongo, pyhamcrest, pydot, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, joblib, jmespath, isodate, idna, httplib2, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, certifi, cachetools, attrs, atomicwrites, sqlalchemy, scipy, requests, pyarrow, pluggy, pandas, overrides, oauth2client, mock, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, deprecation, cffi, botocore, scikit-learn, s3transfer, requests-oauthlib, requests_mock, pytest, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.41.0.dev0 atomicwrites-1.4.1 attrs-21.4.0 azure-core-1.24.2 azure-storage-blob-12.13.0 boto3-1.24.31 botocore-1.27.31 cachetools-4.2.4 certifi-2022.6.15 cffi-1.15.1 charset-normalizer-2.1.0 cloudpickle-2.1.0 crcmod-1.7 cryptography-37.0.4 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.5.2 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.32.0 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.4 google-cloud-bigquery-storage-2.13.2 google-cloud-bigtable-1.7.2 google-cloud-core-2.3.1 google-cloud-datastore-1.15.5 google-cloud-dlp-3.7.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.13.4 google-cloud-pubsublite-1.4.2 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.3 google-cloud-videointelligence-1.16.3 google-cloud-vision-1.0.2 google-crc32c-1.3.0 google-resumable-media-2.3.3 googleapis-common-protos-1.56.4 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.47.0 hdfs-2.7.0 httplib2-0.20.4 idna-3.3 isodate-0.6.1 jmespath-1.0.1 joblib-1.1.0 mock-2.0.0 more-itertools-8.13.0 msrest-0.7.1 numpy-1.21.6 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.7.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.9.0 pluggy-0.13.1 proto-plus-1.20.6 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.28.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sqlalchemy-1.4.39 tenacity-5.1.5 testcontainers-3.6.0 threadpoolctl-3.1.0 typing-extensions-4.3.0 typing-utils-0.1.0 urllib3-1.26.10 wcwidth-0.2.5 websocket-client-1.3.3 wrapt-1.14.1

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.41.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0718125501.1658149302.414888/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0718125501.1658149302.414888/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0718125501.1658149302.414888/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0718125501.1658149302.414888/pipeline.pb in 0 seconds.
usage: pardo_test.py [-h] [--runner RUNNER] [--streaming]
[...truncated: option listing identical to the pardo_test.py usage output above...]
pardo_test.py: error: argument --number_of_counters=0: expected one argument

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 17s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/obda5gf2ngwbe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_ParDo_Dataflow_Streaming #667

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/667/display/redirect?page=changes>

Changes:

[vlad.matyunin] enabled multifile flag for multifile examples (PG)

[noreply] Merge pull request #22300 from Fixed [Playground] DeployExamples,


------------------------------------------
[...truncated 25.72 KB...]
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Requirement already satisfied: pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from httplib2<0.21.0,>=0.8->apache-beam==2.41.0.dev0) (3.0.9)
Collecting pbr>=0.11
  Using cached pbr-5.9.0-py2.py3-none-any.whl (112 kB)
Collecting atomicwrites>=1.0
  Using cached atomicwrites-1.4.1-py2.py3-none-any.whl
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (2.1.3)
Collecting wcwidth
  Using cached wcwidth-0.2.5-py2.py3-none-any.whl (30 kB)
Requirement already satisfied: py>=1.5.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (1.11.0)
Collecting pluggy<1.0,>=0.12
  Using cached pluggy-0.13.1-py2.py3-none-any.whl (18 kB)
Collecting attrs>=17.4.0
  Using cached attrs-21.4.0-py2.py3-none-any.whl (60 kB)
Collecting more-itertools>=4.0.0
  Using cached more_itertools-8.13.0-py3-none-any.whl (51 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.6.15-py3-none-any.whl (160 kB)
Collecting charset-normalizer<3,>=2
  Using cached charset_normalizer-2.1.0-py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.10-py2.py3-none-any.whl (139 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker>=4.0.0
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.3-py3-none-any.whl (54 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.56.4-py2.py3-none-any.whl (211 kB)
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
INFO: pip is looking at multiple versions of google-api-core[grpc,grpcgcp] to determine which version is compatible with other requirements. This could take a while.
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.8.1-py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.8.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.3-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.2-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core[grpc,grpcgcp] to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
Requirement already satisfied: setuptools>=40.3.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-spanner<2,>=1.13.0->apache-beam==2.41.0.dev0) (63.2.0)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-auth<3,>=1.18.0
  Using cached google_auth-1.35.0-py2.py3-none-any.whl (152 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (3.8.1)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.41.0.dev0-py3-none-any.whl size=2827423 sha256=b9d38b0946b59a9669f3ff40d4ff3b6b5201c90a765d7b944992f42f8371e130
  Stored in directory: /home/jenkins/.cache/pip/wheels/4d/28/a8/75e525f0f56ebf0cac86293ab90763d6a2a3105a27bb3ba779
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, threadpoolctl, tenacity, rsa, pyyaml, python-dateutil, pymysql, pymongo, pyhamcrest, pydot, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, joblib, jmespath, isodate, idna, httplib2, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, certifi, cachetools, attrs, atomicwrites, sqlalchemy, scipy, requests, pyarrow, pluggy, pandas, overrides, oauth2client, mock, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, deprecation, cffi, botocore, scikit-learn, s3transfer, requests-oauthlib, requests_mock, pytest, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.41.0.dev0 atomicwrites-1.4.1 attrs-21.4.0 azure-core-1.24.2 azure-storage-blob-12.13.0 boto3-1.24.31 botocore-1.27.31 cachetools-4.2.4 certifi-2022.6.15 cffi-1.15.1 charset-normalizer-2.1.0 cloudpickle-2.1.0 crcmod-1.7 cryptography-37.0.4 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.5.2 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.32.0 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.4 google-cloud-bigquery-storage-2.13.2 google-cloud-bigtable-1.7.2 google-cloud-core-2.3.1 google-cloud-datastore-1.15.5 google-cloud-dlp-3.7.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.13.4 google-cloud-pubsublite-1.4.2 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.3 google-cloud-videointelligence-1.16.3 google-cloud-vision-1.0.2 google-crc32c-1.3.0 google-resumable-media-2.3.3 googleapis-common-protos-1.56.4 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.47.0 hdfs-2.7.0 httplib2-0.20.4 idna-3.3 isodate-0.6.1 jmespath-1.0.1 joblib-1.1.0 mock-2.0.0 more-itertools-8.13.0 msrest-0.7.1 numpy-1.21.6 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.7.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.9.0 pluggy-0.13.1 proto-plus-1.20.6 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.28.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sqlalchemy-1.4.39 tenacity-5.1.5 testcontainers-3.6.0 threadpoolctl-3.1.0 typing-extensions-4.3.0 typing-utils-0.1.0 urllib3-1.26.10 wcwidth-0.2.5 websocket-client-1.3.3 wrapt-1.14.1

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.41.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0717125508.1658062905.854246/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0717125508.1658062905.854246/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0717125508.1658062905.854246/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0717125508.1658062905.854246/pipeline.pb in 0 seconds.
usage: pardo_test.py [-h] [--runner RUNNER] [--streaming]
                     [--resource_hint RESOURCE_HINTS]
                     [--beam_services BEAM_SERVICES]
                     [--type_check_strictness {ALL_REQUIRED,DEFAULT_TO_ANY}]
                     [--type_check_additional TYPE_CHECK_ADDITIONAL]
                     [--no_pipeline_type_check] [--runtime_type_check]
                     [--performance_runtime_type_check]
                     [--allow_non_deterministic_key_coders]
                     [--allow_unsafe_triggers]
                     [--no_direct_runner_use_stacked_bundle]
                     [--direct_runner_bundle_repeat DIRECT_RUNNER_BUNDLE_REPEAT]
                     [--direct_num_workers DIRECT_NUM_WORKERS]
                     [--direct_running_mode {in_memory,multi_threading,multi_processing}]
                     [--direct_embed_docker_python]
                     [--dataflow_endpoint DATAFLOW_ENDPOINT]
                     [--project PROJECT] [--job_name JOB_NAME]
                     [--staging_location STAGING_LOCATION]
                     [--temp_location TEMP_LOCATION] [--region REGION]
                     [--service_account_email SERVICE_ACCOUNT_EMAIL]
                     [--no_auth] [--template_location TEMPLATE_LOCATION]
                     [--label LABELS] [--update]
                     [--transform_name_mapping TRANSFORM_NAME_MAPPING]
                     [--enable_streaming_engine]
                     [--dataflow_kms_key DATAFLOW_KMS_KEY]
                     [--create_from_snapshot CREATE_FROM_SNAPSHOT]
                     [--flexrs_goal {COST_OPTIMIZED,SPEED_OPTIMIZED}]
                     [--dataflow_service_option DATAFLOW_SERVICE_OPTIONS]
                     [--enable_hot_key_logging] [--enable_artifact_caching]
                     [--impersonate_service_account IMPERSONATE_SERVICE_ACCOUNT]
                     [--hdfs_host HDFS_HOST] [--hdfs_port HDFS_PORT]
                     [--hdfs_user HDFS_USER] [--hdfs_full_urls]
                     [--num_workers NUM_WORKERS]
                     [--max_num_workers MAX_NUM_WORKERS]
                     [--autoscaling_algorithm {NONE,THROUGHPUT_BASED}]
                     [--worker_machine_type MACHINE_TYPE]
                     [--disk_size_gb DISK_SIZE_GB]
                     [--worker_disk_type DISK_TYPE]
                     [--worker_region WORKER_REGION]
                     [--worker_zone WORKER_ZONE] [--zone ZONE]
                     [--network NETWORK] [--subnetwork SUBNETWORK]
                     [--worker_harness_container_image WORKER_HARNESS_CONTAINER_IMAGE]
                     [--sdk_container_image SDK_CONTAINER_IMAGE]
                     [--sdk_harness_container_image_overrides SDK_HARNESS_CONTAINER_IMAGE_OVERRIDES]
                     [--default_sdk_harness_log_level DEFAULT_SDK_HARNESS_LOG_LEVEL]
                     [--sdk_harness_log_level_overrides SDK_HARNESS_LOG_LEVEL_OVERRIDES]
                     [--use_public_ips] [--no_use_public_ips]
                     [--min_cpu_platform MIN_CPU_PLATFORM]
                     [--dataflow_worker_jar DATAFLOW_WORKER_JAR]
                     [--dataflow_job_file DATAFLOW_JOB_FILE]
                     [--experiment EXPERIMENTS]
                     [--number_of_worker_harness_threads NUMBER_OF_WORKER_HARNESS_THREADS]
                     [--profile_cpu] [--profile_memory]
                     [--profile_location PROFILE_LOCATION]
                     [--profile_sample_rate PROFILE_SAMPLE_RATE]
                     [--requirements_file REQUIREMENTS_FILE]
                     [--requirements_cache REQUIREMENTS_CACHE]
                     [--requirements_cache_only_sources]
                     [--setup_file SETUP_FILE] [--beam_plugin BEAM_PLUGINS]
                     [--pickle_library {cloudpickle,default,dill}]
                     [--save_main_session] [--sdk_location SDK_LOCATION]
                     [--extra_package EXTRA_PACKAGES]
                     [--prebuild_sdk_container_engine PREBUILD_SDK_CONTAINER_ENGINE]
                     [--prebuild_sdk_container_base_image PREBUILD_SDK_CONTAINER_BASE_IMAGE]
                     [--cloud_build_machine_type CLOUD_BUILD_MACHINE_TYPE]
                     [--docker_registry_push_url DOCKER_REGISTRY_PUSH_URL]
                     [--job_endpoint JOB_ENDPOINT]
                     [--artifact_endpoint ARTIFACT_ENDPOINT]
                     [--job_server_timeout JOB_SERVER_TIMEOUT]
                     [--environment_type ENVIRONMENT_TYPE]
                     [--environment_config ENVIRONMENT_CONFIG]
                     [--environment_option ENVIRONMENT_OPTIONS]
                     [--sdk_worker_parallelism SDK_WORKER_PARALLELISM]
                     [--environment_cache_millis ENVIRONMENT_CACHE_MILLIS]
                     [--output_executable_path OUTPUT_EXECUTABLE_PATH]
                     [--artifacts_dir ARTIFACTS_DIR] [--job_port JOB_PORT]
                     [--artifact_port ARTIFACT_PORT]
                     [--expansion_port EXPANSION_PORT]
                     [--job_server_java_launcher JOB_SERVER_JAVA_LAUNCHER]
                     [--job_server_jvm_properties JOB_SERVER_JVM_PROPERTIES]
                     [--flink_master FLINK_MASTER]
                     [--flink_version {1.12,1.13,1.14,1.15}]
                     [--flink_job_server_jar FLINK_JOB_SERVER_JAR]
                     [--flink_submit_uber_jar]
                     [--spark_master_url SPARK_MASTER_URL]
                     [--spark_job_server_jar SPARK_JOB_SERVER_JAR]
                     [--spark_submit_uber_jar]
                     [--spark_rest_url SPARK_REST_URL] [--spark_version {2,3}]
                     [--on_success_matcher ON_SUCCESS_MATCHER]
                     [--dry_run DRY_RUN]
                     [--wait_until_finish_duration WAIT_UNTIL_FINISH_DURATION]
                     [--pubsub_root_url PUBSUBROOTURL]
                     [--s3_access_key_id S3_ACCESS_KEY_ID]
                     [--s3_secret_access_key S3_SECRET_ACCESS_KEY]
                     [--s3_session_token S3_SESSION_TOKEN]
                     [--s3_endpoint_url S3_ENDPOINT_URL]
                     [--s3_region_name S3_REGION_NAME]
                     [--s3_api_version S3_API_VERSION] [--s3_verify S3_VERIFY]
                     [--s3_disable_ssl]
                     [--publish_to_big_query PUBLISH_TO_BIG_QUERY]
                     [--metrics_dataset METRICS_DATASET]
                     [--metrics_table METRICS_TABLE]
                     [--influx_measurement INFLUX_MEASUREMENT]
                     [--influx_db_name INFLUX_DB_NAME]
                     [--influx_hostname INFLUX_HOSTNAME]
                     [--input_options INPUT_OPTIONS] [--timeout_ms TIMEOUT_MS]
                     [--iterations ITERATIONS]
                     [--number_of_counter_operations NUMBER_OF_COUNTER_OPERATIONS]
                     [--number_of_counters=0 NUMBER_OF_COUNTERS=0]
pardo_test.py: error: argument --number_of_counters=0: expected one argument
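The argparse failure above can be reproduced with a minimal sketch. When the literal text "=0" is baked into the option string itself, argparse registers "--number_of_counters=0" as the flag's full name and then demands a separate value after that exact token. The parser below is hypothetical, not the actual pardo_test.py source:

```python
import argparse

def build_parser(broken=True):
    # Hypothetical minimal parser illustrating the failure mode; the real
    # pardo_test.py defines many more options than shown here.
    parser = argparse.ArgumentParser(prog="pardo_test.py")
    if broken:
        # Bug: "=0" is part of the option *name*, so argparse expects a
        # separate value to follow the literal flag "--number_of_counters=0".
        parser.add_argument("--number_of_counters=0")
    else:
        # Fix: declare the flag name alone and supply the default separately;
        # the token "--number_of_counters=0" then parses as flag plus value.
        parser.add_argument("--number_of_counters", type=int, default=0)
    return parser
```

With broken=True, parsing the single token --number_of_counters=0 exits with "error: argument --number_of_counters=0: expected one argument", matching the log above; with broken=False the same token parses cleanly to number_of_counters == 0.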

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 23s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/tab32zqq3xt4y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_ParDo_Dataflow_Streaming #666

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/666/display/redirect?page=changes>

Changes:

[noreply] Bump protobufjs from 6.11.2 to 6.11.3 in /sdks/typescript

[egalpin] Moves timestamp skew override to correct place

[egalpin] Adds TestStream to verify window preservation of ElasticsearchIO#write

[egalpin] Removes unnecessary line

[egalpin] Adds validation that ES#Write outputs are in expected windows

[egalpin] Updates window verification test to assert the exact docs in the window

[egalpin] Uses guava Iterables over shaded avro version

[Robert Bradshaw] Don't try to parse non-flags as retained pipeline options.

[chamikaramj] Enables UnboundedSource wrapped SDF Kafka source by default for x-lang

[noreply] Merge pull request #22140 from [Playground Task] Sharing any code API

[bulat.safiullin] [Website] add playground section, update playground, update get-started

[noreply] RunInference documentation updates. (#22236)

[noreply] Turn pr bot on for remaining common labels (#22257)

[noreply] Reviewing the RunInference ReadMe file for clarity. (#22069)

[noreply] Collect heap profile on OOM on Dataflow (#22225)

[noreply] fixing the missing wrap around ring range read (#21786)

[noreply] Update RunInference documentation (#22250)

[noreply] Rewrote Java multi-language pipeline quickstart (#22263)


------------------------------------------
[...truncated 25.74 KB...]
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Requirement already satisfied: pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from httplib2<0.21.0,>=0.8->apache-beam==2.41.0.dev0) (3.0.9)
Collecting pbr>=0.11
  Using cached pbr-5.9.0-py2.py3-none-any.whl (112 kB)
Collecting pluggy<1.0,>=0.12
  Using cached pluggy-0.13.1-py2.py3-none-any.whl (18 kB)
Collecting atomicwrites>=1.0
  Using cached atomicwrites-1.4.1-py2.py3-none-any.whl
Collecting wcwidth
  Using cached wcwidth-0.2.5-py2.py3-none-any.whl (30 kB)
Collecting attrs>=17.4.0
  Using cached attrs-21.4.0-py2.py3-none-any.whl (60 kB)
Requirement already satisfied: py>=1.5.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (1.11.0)
Collecting more-itertools>=4.0.0
  Using cached more_itertools-8.13.0-py3-none-any.whl (51 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (2.1.3)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.10-py2.py3-none-any.whl (139 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.6.15-py3-none-any.whl (160 kB)
Collecting charset-normalizer<3,>=2
  Using cached charset_normalizer-2.1.0-py3-none-any.whl (39 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker>=4.0.0
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.3-py3-none-any.whl (54 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.56.4-py2.py3-none-any.whl (211 kB)
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
INFO: pip is looking at multiple versions of google-api-core[grpc,grpcgcp] to determine which version is compatible with other requirements. This could take a while.
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.8.1-py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.8.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.3-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.2-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core[grpc,grpcgcp] to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
Requirement already satisfied: setuptools>=40.3.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-spanner<2,>=1.13.0->apache-beam==2.41.0.dev0) (63.2.0)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-auth<3,>=1.18.0
  Using cached google_auth-1.35.0-py2.py3-none-any.whl (152 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (3.8.1)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.41.0.dev0-py3-none-any.whl size=2827423 sha256=1c3a8ee73b43fa7f41d6032ad532628a62b98aa5388944eab1cf73bb3ad61b21
  Stored in directory: /home/jenkins/.cache/pip/wheels/4d/28/a8/75e525f0f56ebf0cac86293ab90763d6a2a3105a27bb3ba779
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, threadpoolctl, tenacity, rsa, pyyaml, python-dateutil, pymysql, pymongo, pyhamcrest, pydot, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, joblib, jmespath, isodate, idna, httplib2, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, certifi, cachetools, attrs, atomicwrites, sqlalchemy, scipy, requests, pyarrow, pluggy, pandas, overrides, oauth2client, mock, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, deprecation, cffi, botocore, scikit-learn, s3transfer, requests-oauthlib, requests_mock, pytest, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.41.0.dev0 atomicwrites-1.4.1 attrs-21.4.0 azure-core-1.24.2 azure-storage-blob-12.13.0 boto3-1.24.31 botocore-1.27.31 cachetools-4.2.4 certifi-2022.6.15 cffi-1.15.1 charset-normalizer-2.1.0 cloudpickle-2.1.0 crcmod-1.7 cryptography-37.0.4 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.5.2 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.32.0 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.4 google-cloud-bigquery-storage-2.13.2 google-cloud-bigtable-1.7.2 google-cloud-core-2.3.1 google-cloud-datastore-1.15.5 google-cloud-dlp-3.7.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.13.3 google-cloud-pubsublite-1.4.2 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.3 google-cloud-videointelligence-1.16.3 google-cloud-vision-1.0.2 google-crc32c-1.3.0 google-resumable-media-2.3.3 googleapis-common-protos-1.56.4 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.47.0 hdfs-2.7.0 httplib2-0.20.4 idna-3.3 isodate-0.6.1 jmespath-1.0.1 joblib-1.1.0 mock-2.0.0 more-itertools-8.13.0 msrest-0.7.1 numpy-1.21.6 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.7.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.9.0 pluggy-0.13.1 proto-plus-1.20.6 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.28.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sqlalchemy-1.4.39 tenacity-5.1.5 testcontainers-3.6.0 threadpoolctl-3.1.0 typing-extensions-4.3.0 typing-utils-0.1.0 urllib3-1.26.10 wcwidth-0.2.5 websocket-client-1.3.3 wrapt-1.14.1

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.41.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0716125502.1657976496.572470/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0716125502.1657976496.572470/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0716125502.1657976496.572470/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0716125502.1657976496.572470/pipeline.pb in 0 seconds.
usage: pardo_test.py [-h] [--runner RUNNER] [--streaming]
                     [--resource_hint RESOURCE_HINTS]
                     [--beam_services BEAM_SERVICES]
                     [--type_check_strictness {ALL_REQUIRED,DEFAULT_TO_ANY}]
                     [--type_check_additional TYPE_CHECK_ADDITIONAL]
                     [--no_pipeline_type_check] [--runtime_type_check]
                     [--performance_runtime_type_check]
                     [--allow_non_deterministic_key_coders]
                     [--allow_unsafe_triggers]
                     [--no_direct_runner_use_stacked_bundle]
                     [--direct_runner_bundle_repeat DIRECT_RUNNER_BUNDLE_REPEAT]
                     [--direct_num_workers DIRECT_NUM_WORKERS]
                     [--direct_running_mode {in_memory,multi_threading,multi_processing}]
                     [--direct_embed_docker_python]
                     [--dataflow_endpoint DATAFLOW_ENDPOINT]
                     [--project PROJECT] [--job_name JOB_NAME]
                     [--staging_location STAGING_LOCATION]
                     [--temp_location TEMP_LOCATION] [--region REGION]
                     [--service_account_email SERVICE_ACCOUNT_EMAIL]
                     [--no_auth] [--template_location TEMPLATE_LOCATION]
                     [--label LABELS] [--update]
                     [--transform_name_mapping TRANSFORM_NAME_MAPPING]
                     [--enable_streaming_engine]
                     [--dataflow_kms_key DATAFLOW_KMS_KEY]
                     [--create_from_snapshot CREATE_FROM_SNAPSHOT]
                     [--flexrs_goal {COST_OPTIMIZED,SPEED_OPTIMIZED}]
                     [--dataflow_service_option DATAFLOW_SERVICE_OPTIONS]
                     [--enable_hot_key_logging] [--enable_artifact_caching]
                     [--impersonate_service_account IMPERSONATE_SERVICE_ACCOUNT]
                     [--hdfs_host HDFS_HOST] [--hdfs_port HDFS_PORT]
                     [--hdfs_user HDFS_USER] [--hdfs_full_urls]
                     [--num_workers NUM_WORKERS]
                     [--max_num_workers MAX_NUM_WORKERS]
                     [--autoscaling_algorithm {NONE,THROUGHPUT_BASED}]
                     [--worker_machine_type MACHINE_TYPE]
                     [--disk_size_gb DISK_SIZE_GB]
                     [--worker_disk_type DISK_TYPE]
                     [--worker_region WORKER_REGION]
                     [--worker_zone WORKER_ZONE] [--zone ZONE]
                     [--network NETWORK] [--subnetwork SUBNETWORK]
                     [--worker_harness_container_image WORKER_HARNESS_CONTAINER_IMAGE]
                     [--sdk_container_image SDK_CONTAINER_IMAGE]
                     [--sdk_harness_container_image_overrides SDK_HARNESS_CONTAINER_IMAGE_OVERRIDES]
                     [--default_sdk_harness_log_level DEFAULT_SDK_HARNESS_LOG_LEVEL]
                     [--sdk_harness_log_level_overrides SDK_HARNESS_LOG_LEVEL_OVERRIDES]
                     [--use_public_ips] [--no_use_public_ips]
                     [--min_cpu_platform MIN_CPU_PLATFORM]
                     [--dataflow_worker_jar DATAFLOW_WORKER_JAR]
                     [--dataflow_job_file DATAFLOW_JOB_FILE]
                     [--experiment EXPERIMENTS]
                     [--number_of_worker_harness_threads NUMBER_OF_WORKER_HARNESS_THREADS]
                     [--profile_cpu] [--profile_memory]
                     [--profile_location PROFILE_LOCATION]
                     [--profile_sample_rate PROFILE_SAMPLE_RATE]
                     [--requirements_file REQUIREMENTS_FILE]
                     [--requirements_cache REQUIREMENTS_CACHE]
                     [--requirements_cache_only_sources]
                     [--setup_file SETUP_FILE] [--beam_plugin BEAM_PLUGINS]
                     [--pickle_library {cloudpickle,default,dill}]
                     [--save_main_session] [--sdk_location SDK_LOCATION]
                     [--extra_package EXTRA_PACKAGES]
                     [--prebuild_sdk_container_engine PREBUILD_SDK_CONTAINER_ENGINE]
                     [--prebuild_sdk_container_base_image PREBUILD_SDK_CONTAINER_BASE_IMAGE]
                     [--cloud_build_machine_type CLOUD_BUILD_MACHINE_TYPE]
                     [--docker_registry_push_url DOCKER_REGISTRY_PUSH_URL]
                     [--job_endpoint JOB_ENDPOINT]
                     [--artifact_endpoint ARTIFACT_ENDPOINT]
                     [--job_server_timeout JOB_SERVER_TIMEOUT]
                     [--environment_type ENVIRONMENT_TYPE]
                     [--environment_config ENVIRONMENT_CONFIG]
                     [--environment_option ENVIRONMENT_OPTIONS]
                     [--sdk_worker_parallelism SDK_WORKER_PARALLELISM]
                     [--environment_cache_millis ENVIRONMENT_CACHE_MILLIS]
                     [--output_executable_path OUTPUT_EXECUTABLE_PATH]
                     [--artifacts_dir ARTIFACTS_DIR] [--job_port JOB_PORT]
                     [--artifact_port ARTIFACT_PORT]
                     [--expansion_port EXPANSION_PORT]
                     [--job_server_java_launcher JOB_SERVER_JAVA_LAUNCHER]
                     [--job_server_jvm_properties JOB_SERVER_JVM_PROPERTIES]
                     [--flink_master FLINK_MASTER]
                     [--flink_version {1.12,1.13,1.14,1.15}]
                     [--flink_job_server_jar FLINK_JOB_SERVER_JAR]
                     [--flink_submit_uber_jar]
                     [--spark_master_url SPARK_MASTER_URL]
                     [--spark_job_server_jar SPARK_JOB_SERVER_JAR]
                     [--spark_submit_uber_jar]
                     [--spark_rest_url SPARK_REST_URL] [--spark_version {2,3}]
                     [--on_success_matcher ON_SUCCESS_MATCHER]
                     [--dry_run DRY_RUN]
                     [--wait_until_finish_duration WAIT_UNTIL_FINISH_DURATION]
                     [--pubsub_root_url PUBSUBROOTURL]
                     [--s3_access_key_id S3_ACCESS_KEY_ID]
                     [--s3_secret_access_key S3_SECRET_ACCESS_KEY]
                     [--s3_session_token S3_SESSION_TOKEN]
                     [--s3_endpoint_url S3_ENDPOINT_URL]
                     [--s3_region_name S3_REGION_NAME]
                     [--s3_api_version S3_API_VERSION] [--s3_verify S3_VERIFY]
                     [--s3_disable_ssl]
                     [--publish_to_big_query PUBLISH_TO_BIG_QUERY]
                     [--metrics_dataset METRICS_DATASET]
                     [--metrics_table METRICS_TABLE]
                     [--influx_measurement INFLUX_MEASUREMENT]
                     [--influx_db_name INFLUX_DB_NAME]
                     [--influx_hostname INFLUX_HOSTNAME]
                     [--input_options INPUT_OPTIONS] [--timeout_ms TIMEOUT_MS]
                     [--iterations ITERATIONS]
                     [--number_of_counter_operations NUMBER_OF_COUNTER_OPERATIONS]
                     [--number_of_counters=0 NUMBER_OF_COUNTERS=0]
pardo_test.py: error: argument --number_of_counters=0: expected one argument
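The `expected one argument` failure above comes from the flag being registered with `=0` baked into its option name, as the usage text shows (`[--number_of_counters=0 NUMBER_OF_COUNTERS=0]`). A minimal, hypothetical argparse sketch (not the actual pardo_test.py source) reproduces the same error and shows the corrected registration:

```python
import argparse

# Buggy registration (hypothetical): the literal "=0" is part of the flag
# name, so argparse treats "--number_of_counters=0" as the whole option
# string and then still expects a separate value after it.
buggy = argparse.ArgumentParser(prog='pardo_test.py')
buggy.add_argument('--number_of_counters=0', type=int)

try:
    # Matches the log: "argument --number_of_counters=0: expected one argument"
    buggy.parse_args(['--number_of_counters=0'])
except SystemExit:
    print('buggy parser rejected the flag')

# Corrected registration: the flag name carries no value; "=0" is then
# parsed normally as the argument.
fixed = argparse.ArgumentParser(prog='pardo_test.py')
fixed.add_argument('--number_of_counters', type=int, default=0)
args = fixed.parse_args(['--number_of_counters=0'])
print(args.number_of_counters)  # prints 0
```

In other words, the job configuration (or the option definition it generates) needs `--number_of_counters 0` or `--number_of_counters=0` against a flag named plain `--number_of_counters`.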

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 12s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dw3mi5nywg57o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_ParDo_Dataflow_Streaming #665

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/665/display/redirect?page=changes>

Changes:

[vitaly.terentyev] [BEAM-14101] Add Spark Receiver IO package and ReceiverBuilder

[Heejong Lee] [BEAM-22229] Override external SDK container URLs for Dataflow by

[danthev] Fix query retry in Java FirestoreIO.

[noreply] Split words on new lines or spaces (#22270)

[noreply] Replace \r\n, not just \n

[noreply] Pg auth test (#22277)

[noreply] [BEAM-14073] [CdapIO] CDAP IO for batch plugins: Read, Write. Unit tests

[Heejong Lee] update

[noreply] [Fix #22151] Add fhirio.Deidentify transform (#22152)

[noreply] Remove locks around ExecutionStateSampler (#22190)


------------------------------------------
[...truncated 25.69 KB...]
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Requirement already satisfied: pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from httplib2<0.21.0,>=0.8->apache-beam==2.41.0.dev0) (3.0.9)
Collecting pbr>=0.11
  Using cached pbr-5.9.0-py2.py3-none-any.whl (112 kB)
Collecting atomicwrites>=1.0
  Using cached atomicwrites-1.4.1-py2.py3-none-any.whl
Requirement already satisfied: py>=1.5.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (1.11.0)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (2.1.3)
Collecting more-itertools>=4.0.0
  Using cached more_itertools-8.13.0-py3-none-any.whl (51 kB)
Collecting attrs>=17.4.0
  Using cached attrs-21.4.0-py2.py3-none-any.whl (60 kB)
Collecting wcwidth
  Using cached wcwidth-0.2.5-py2.py3-none-any.whl (30 kB)
Collecting pluggy<1.0,>=0.12
  Using cached pluggy-0.13.1-py2.py3-none-any.whl (18 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<3,>=2
  Using cached charset_normalizer-2.1.0-py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.10-py2.py3-none-any.whl (139 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.6.15-py3-none-any.whl (160 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker>=4.0.0
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.3-py3-none-any.whl (54 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.56.4-py2.py3-none-any.whl (211 kB)
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
INFO: pip is looking at multiple versions of google-api-core[grpc,grpcgcp] to determine which version is compatible with other requirements. This could take a while.
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.8.1-py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.8.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.3-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.2-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core[grpc,grpcgcp] to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
Requirement already satisfied: setuptools>=40.3.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-spanner<2,>=1.13.0->apache-beam==2.41.0.dev0) (63.2.0)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-auth<3,>=1.18.0
  Using cached google_auth-1.35.0-py2.py3-none-any.whl (152 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (3.8.1)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.41.0.dev0-py3-none-any.whl size=2823479 sha256=32664c16143566122583879a533fbc452aed764b37ee7b790f6612d7b2a707fc
  Stored in directory: /home/jenkins/.cache/pip/wheels/4d/28/a8/75e525f0f56ebf0cac86293ab90763d6a2a3105a27bb3ba779
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, threadpoolctl, tenacity, rsa, pyyaml, python-dateutil, pymysql, pymongo, pyhamcrest, pydot, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, joblib, jmespath, isodate, idna, httplib2, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, certifi, cachetools, attrs, atomicwrites, sqlalchemy, scipy, requests, pyarrow, pluggy, pandas, overrides, oauth2client, mock, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, deprecation, cffi, botocore, scikit-learn, s3transfer, requests-oauthlib, requests_mock, pytest, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.41.0.dev0 atomicwrites-1.4.1 attrs-21.4.0 azure-core-1.24.2 azure-storage-blob-12.13.0 boto3-1.24.30 botocore-1.27.30 cachetools-4.2.4 certifi-2022.6.15 cffi-1.15.1 charset-normalizer-2.1.0 cloudpickle-2.1.0 crcmod-1.7 cryptography-37.0.4 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.5.2 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.32.0 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.4 google-cloud-bigquery-storage-2.13.2 google-cloud-bigtable-1.7.2 google-cloud-core-2.3.1 google-cloud-datastore-1.15.5 google-cloud-dlp-3.7.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.13.3 google-cloud-pubsublite-1.4.2 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.3 google-cloud-videointelligence-1.16.3 google-cloud-vision-1.0.2 google-crc32c-1.3.0 google-resumable-media-2.3.3 googleapis-common-protos-1.56.4 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.47.0 hdfs-2.7.0 httplib2-0.20.4 idna-3.3 isodate-0.6.1 jmespath-1.0.1 joblib-1.1.0 mock-2.0.0 more-itertools-8.13.0 msrest-0.7.1 numpy-1.21.6 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.7.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.9.0 pluggy-0.13.1 proto-plus-1.20.6 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.28.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sqlalchemy-1.4.39 tenacity-5.1.5 testcontainers-3.6.0 threadpoolctl-3.1.0 typing-extensions-4.3.0 typing-utils-0.1.0 urllib3-1.26.10 wcwidth-0.2.5 websocket-client-1.3.3 wrapt-1.14.1

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.41.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0715125513.1657890090.759368/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0715125513.1657890090.759368/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0715125513.1657890090.759368/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0715125513.1657890090.759368/pipeline.pb in 0 seconds.
usage: pardo_test.py [-h] [--runner RUNNER] [--streaming]
                     [--resource_hint RESOURCE_HINTS]
                     [--beam_services BEAM_SERVICES]
                     [--type_check_strictness {ALL_REQUIRED,DEFAULT_TO_ANY}]
                     [--type_check_additional TYPE_CHECK_ADDITIONAL]
                     [--no_pipeline_type_check] [--runtime_type_check]
                     [--performance_runtime_type_check]
                     [--allow_non_deterministic_key_coders]
                     [--allow_unsafe_triggers]
                     [--no_direct_runner_use_stacked_bundle]
                     [--direct_runner_bundle_repeat DIRECT_RUNNER_BUNDLE_REPEAT]
                     [--direct_num_workers DIRECT_NUM_WORKERS]
                     [--direct_running_mode {in_memory,multi_threading,multi_processing}]
                     [--direct_embed_docker_python]
                     [--dataflow_endpoint DATAFLOW_ENDPOINT]
                     [--project PROJECT] [--job_name JOB_NAME]
                     [--staging_location STAGING_LOCATION]
                     [--temp_location TEMP_LOCATION] [--region REGION]
                     [--service_account_email SERVICE_ACCOUNT_EMAIL]
                     [--no_auth] [--template_location TEMPLATE_LOCATION]
                     [--label LABELS] [--update]
                     [--transform_name_mapping TRANSFORM_NAME_MAPPING]
                     [--enable_streaming_engine]
                     [--dataflow_kms_key DATAFLOW_KMS_KEY]
                     [--create_from_snapshot CREATE_FROM_SNAPSHOT]
                     [--flexrs_goal {COST_OPTIMIZED,SPEED_OPTIMIZED}]
                     [--dataflow_service_option DATAFLOW_SERVICE_OPTIONS]
                     [--enable_hot_key_logging] [--enable_artifact_caching]
                     [--impersonate_service_account IMPERSONATE_SERVICE_ACCOUNT]
                     [--hdfs_host HDFS_HOST] [--hdfs_port HDFS_PORT]
                     [--hdfs_user HDFS_USER] [--hdfs_full_urls]
                     [--num_workers NUM_WORKERS]
                     [--max_num_workers MAX_NUM_WORKERS]
                     [--autoscaling_algorithm {NONE,THROUGHPUT_BASED}]
                     [--worker_machine_type MACHINE_TYPE]
                     [--disk_size_gb DISK_SIZE_GB]
                     [--worker_disk_type DISK_TYPE]
                     [--worker_region WORKER_REGION]
                     [--worker_zone WORKER_ZONE] [--zone ZONE]
                     [--network NETWORK] [--subnetwork SUBNETWORK]
                     [--worker_harness_container_image WORKER_HARNESS_CONTAINER_IMAGE]
                     [--sdk_container_image SDK_CONTAINER_IMAGE]
                     [--sdk_harness_container_image_overrides SDK_HARNESS_CONTAINER_IMAGE_OVERRIDES]
                     [--default_sdk_harness_log_level DEFAULT_SDK_HARNESS_LOG_LEVEL]
                     [--sdk_harness_log_level_overrides SDK_HARNESS_LOG_LEVEL_OVERRIDES]
                     [--use_public_ips] [--no_use_public_ips]
                     [--min_cpu_platform MIN_CPU_PLATFORM]
                     [--dataflow_worker_jar DATAFLOW_WORKER_JAR]
                     [--dataflow_job_file DATAFLOW_JOB_FILE]
                     [--experiment EXPERIMENTS]
                     [--number_of_worker_harness_threads NUMBER_OF_WORKER_HARNESS_THREADS]
                     [--profile_cpu] [--profile_memory]
                     [--profile_location PROFILE_LOCATION]
                     [--profile_sample_rate PROFILE_SAMPLE_RATE]
                     [--requirements_file REQUIREMENTS_FILE]
                     [--requirements_cache REQUIREMENTS_CACHE]
                     [--requirements_cache_only_sources]
                     [--setup_file SETUP_FILE] [--beam_plugin BEAM_PLUGINS]
                     [--pickle_library {cloudpickle,default,dill}]
                     [--save_main_session] [--sdk_location SDK_LOCATION]
                     [--extra_package EXTRA_PACKAGES]
                     [--prebuild_sdk_container_engine PREBUILD_SDK_CONTAINER_ENGINE]
                     [--prebuild_sdk_container_base_image PREBUILD_SDK_CONTAINER_BASE_IMAGE]
                     [--cloud_build_machine_type CLOUD_BUILD_MACHINE_TYPE]
                     [--docker_registry_push_url DOCKER_REGISTRY_PUSH_URL]
                     [--job_endpoint JOB_ENDPOINT]
                     [--artifact_endpoint ARTIFACT_ENDPOINT]
                     [--job_server_timeout JOB_SERVER_TIMEOUT]
                     [--environment_type ENVIRONMENT_TYPE]
                     [--environment_config ENVIRONMENT_CONFIG]
                     [--environment_option ENVIRONMENT_OPTIONS]
                     [--sdk_worker_parallelism SDK_WORKER_PARALLELISM]
                     [--environment_cache_millis ENVIRONMENT_CACHE_MILLIS]
                     [--output_executable_path OUTPUT_EXECUTABLE_PATH]
                     [--artifacts_dir ARTIFACTS_DIR] [--job_port JOB_PORT]
                     [--artifact_port ARTIFACT_PORT]
                     [--expansion_port EXPANSION_PORT]
                     [--job_server_java_launcher JOB_SERVER_JAVA_LAUNCHER]
                     [--job_server_jvm_properties JOB_SERVER_JVM_PROPERTIES]
                     [--flink_master FLINK_MASTER]
                     [--flink_version {1.12,1.13,1.14,1.15}]
                     [--flink_job_server_jar FLINK_JOB_SERVER_JAR]
                     [--flink_submit_uber_jar]
                     [--spark_master_url SPARK_MASTER_URL]
                     [--spark_job_server_jar SPARK_JOB_SERVER_JAR]
                     [--spark_submit_uber_jar]
                     [--spark_rest_url SPARK_REST_URL] [--spark_version {2,3}]
                     [--on_success_matcher ON_SUCCESS_MATCHER]
                     [--dry_run DRY_RUN]
                     [--wait_until_finish_duration WAIT_UNTIL_FINISH_DURATION]
                     [--pubsub_root_url PUBSUBROOTURL]
                     [--s3_access_key_id S3_ACCESS_KEY_ID]
                     [--s3_secret_access_key S3_SECRET_ACCESS_KEY]
                     [--s3_session_token S3_SESSION_TOKEN]
                     [--s3_endpoint_url S3_ENDPOINT_URL]
                     [--s3_region_name S3_REGION_NAME]
                     [--s3_api_version S3_API_VERSION] [--s3_verify S3_VERIFY]
                     [--s3_disable_ssl]
                     [--publish_to_big_query PUBLISH_TO_BIG_QUERY]
                     [--metrics_dataset METRICS_DATASET]
                     [--metrics_table METRICS_TABLE]
                     [--influx_measurement INFLUX_MEASUREMENT]
                     [--influx_db_name INFLUX_DB_NAME]
                     [--influx_hostname INFLUX_HOSTNAME]
                     [--input_options INPUT_OPTIONS] [--timeout_ms TIMEOUT_MS]
                     [--iterations ITERATIONS]
                     [--number_of_counter_operations NUMBER_OF_COUNTER_OPERATIONS]
                     [--number_of_counters=0 NUMBER_OF_COUNTERS=0]
pardo_test.py: error: argument --number_of_counters=0: expected one argument

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 5s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/jxfdk55uxw35i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_ParDo_Dataflow_Streaming #664

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/664/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-14506] Adding testcases and examples for xlang Python RunInference

[Heejong Lee] update

[Heejong Lee] update

[noreply] Move Go Primitives Integration Tests to Generic Registration (#22247)

[noreply] Move native Go examples to generic registration (#22245)

[noreply] Move youngoli to the reviewer exclusion list (#22195)

[noreply] Bump google.golang.org/api from 0.86.0 to 0.87.0 in /sdks (#22253)

[noreply] Bump cloud.google.com/go/bigquery from 1.34.1 to 1.35.0 in /sdks

[noreply] Bump google.golang.org/grpc from 1.47.0 to 1.48.0 in /sdks (#22252)

[noreply] Merge pull request #15786: Add gap-filling transform for timeseries

[chamikaramj] Adds an experiment that allows opting into using Kafka SDF-wrapper

[noreply] Defocus iframe on blur or mouseout (#22153) (#22154)

[noreply] Fix pydoc rendering for annotated classes (#22121)

[noreply] Fix typo in comment (#22266)


------------------------------------------
[...truncated 25.63 KB...]
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Requirement already satisfied: pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from httplib2<0.21.0,>=0.8->apache-beam==2.41.0.dev0) (3.0.9)
Collecting pbr>=0.11
  Using cached pbr-5.9.0-py2.py3-none-any.whl (112 kB)
Collecting atomicwrites>=1.0
  Using cached atomicwrites-1.4.1-py2.py3-none-any.whl
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (2.1.3)
Collecting pluggy<1.0,>=0.12
  Using cached pluggy-0.13.1-py2.py3-none-any.whl (18 kB)
Collecting wcwidth
  Using cached wcwidth-0.2.5-py2.py3-none-any.whl (30 kB)
Requirement already satisfied: py>=1.5.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (1.11.0)
Collecting more-itertools>=4.0.0
  Using cached more_itertools-8.13.0-py3-none-any.whl (51 kB)
Collecting attrs>=17.4.0
  Using cached attrs-21.4.0-py2.py3-none-any.whl (60 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting charset-normalizer<3,>=2
  Using cached charset_normalizer-2.1.0-py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.10-py2.py3-none-any.whl (139 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.6.15-py3-none-any.whl (160 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker>=4.0.0
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.3-py3-none-any.whl (54 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.56.4-py2.py3-none-any.whl (211 kB)
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
INFO: pip is looking at multiple versions of google-api-core[grpc,grpcgcp] to determine which version is compatible with other requirements. This could take a while.
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.8.1-py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.8.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.3-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.2-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core[grpc,grpcgcp] to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core to determine which version is compatible with other requirements. This could take a while.
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
Requirement already satisfied: setuptools>=40.3.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-spanner<2,>=1.13.0->apache-beam==2.41.0.dev0) (63.2.0)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-auth<3,>=1.18.0
  Using cached google_auth-1.35.0-py2.py3-none-any.whl (152 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (3.8.1)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.41.0.dev0-py3-none-any.whl size=2823479 sha256=b03da458a6c17330016f7d6b7f4ce0ed5828dc1c5f2c016cb0e414f9ef404804
  Stored in directory: /home/jenkins/.cache/pip/wheels/4d/28/a8/75e525f0f56ebf0cac86293ab90763d6a2a3105a27bb3ba779
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, threadpoolctl, tenacity, rsa, pyyaml, python-dateutil, pymysql, pymongo, pyhamcrest, pydot, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, joblib, jmespath, isodate, idna, httplib2, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, certifi, cachetools, attrs, atomicwrites, sqlalchemy, scipy, requests, pyarrow, pluggy, pandas, overrides, oauth2client, mock, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, deprecation, cffi, botocore, scikit-learn, s3transfer, requests-oauthlib, requests_mock, pytest, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.41.0.dev0 atomicwrites-1.4.1 attrs-21.4.0 azure-core-1.24.2 azure-storage-blob-12.13.0 boto3-1.24.29 botocore-1.27.29 cachetools-4.2.4 certifi-2022.6.15 cffi-1.15.1 charset-normalizer-2.1.0 cloudpickle-2.1.0 crcmod-1.7 cryptography-37.0.4 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.5.2 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.32.0 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.4 google-cloud-bigquery-storage-2.13.2 google-cloud-bigtable-1.7.2 google-cloud-core-2.3.1 google-cloud-datastore-1.15.5 google-cloud-dlp-3.7.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.13.3 google-cloud-pubsublite-1.4.2 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.3 google-cloud-videointelligence-1.16.3 google-cloud-vision-1.0.2 google-crc32c-1.3.0 google-resumable-media-2.3.3 googleapis-common-protos-1.56.4 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.47.0 hdfs-2.7.0 httplib2-0.20.4 idna-3.3 isodate-0.6.1 jmespath-1.0.1 joblib-1.1.0 mock-2.0.0 more-itertools-8.13.0 msrest-0.7.1 numpy-1.21.6 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.7.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.9.0 pluggy-0.13.1 proto-plus-1.20.6 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.28.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sqlalchemy-1.4.39 tenacity-5.1.5 testcontainers-3.6.0 threadpoolctl-3.1.0 typing-extensions-4.3.0 typing-utils-0.1.0 urllib3-1.26.10 wcwidth-0.2.5 websocket-client-1.3.3 wrapt-1.14.1

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.41.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0714125503.1657803692.828921/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0714125503.1657803692.828921/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0714125503.1657803692.828921/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0714125503.1657803692.828921/pipeline.pb in 0 seconds.
usage: pardo_test.py [-h] [--runner RUNNER] [--streaming]
                     [--resource_hint RESOURCE_HINTS]
                     [--beam_services BEAM_SERVICES]
                     [--type_check_strictness {ALL_REQUIRED,DEFAULT_TO_ANY}]
                     [--type_check_additional TYPE_CHECK_ADDITIONAL]
                     [--no_pipeline_type_check] [--runtime_type_check]
                     [--performance_runtime_type_check]
                     [--allow_non_deterministic_key_coders]
                     [--allow_unsafe_triggers]
                     [--no_direct_runner_use_stacked_bundle]
                     [--direct_runner_bundle_repeat DIRECT_RUNNER_BUNDLE_REPEAT]
                     [--direct_num_workers DIRECT_NUM_WORKERS]
                     [--direct_running_mode {in_memory,multi_threading,multi_processing}]
                     [--direct_embed_docker_python]
                     [--dataflow_endpoint DATAFLOW_ENDPOINT]
                     [--project PROJECT] [--job_name JOB_NAME]
                     [--staging_location STAGING_LOCATION]
                     [--temp_location TEMP_LOCATION] [--region REGION]
                     [--service_account_email SERVICE_ACCOUNT_EMAIL]
                     [--no_auth] [--template_location TEMPLATE_LOCATION]
                     [--label LABELS] [--update]
                     [--transform_name_mapping TRANSFORM_NAME_MAPPING]
                     [--enable_streaming_engine]
                     [--dataflow_kms_key DATAFLOW_KMS_KEY]
                     [--create_from_snapshot CREATE_FROM_SNAPSHOT]
                     [--flexrs_goal {COST_OPTIMIZED,SPEED_OPTIMIZED}]
                     [--dataflow_service_option DATAFLOW_SERVICE_OPTIONS]
                     [--enable_hot_key_logging] [--enable_artifact_caching]
                     [--impersonate_service_account IMPERSONATE_SERVICE_ACCOUNT]
                     [--hdfs_host HDFS_HOST] [--hdfs_port HDFS_PORT]
                     [--hdfs_user HDFS_USER] [--hdfs_full_urls]
                     [--num_workers NUM_WORKERS]
                     [--max_num_workers MAX_NUM_WORKERS]
                     [--autoscaling_algorithm {NONE,THROUGHPUT_BASED}]
                     [--worker_machine_type MACHINE_TYPE]
                     [--disk_size_gb DISK_SIZE_GB]
                     [--worker_disk_type DISK_TYPE]
                     [--worker_region WORKER_REGION]
                     [--worker_zone WORKER_ZONE] [--zone ZONE]
                     [--network NETWORK] [--subnetwork SUBNETWORK]
                     [--worker_harness_container_image WORKER_HARNESS_CONTAINER_IMAGE]
                     [--sdk_container_image SDK_CONTAINER_IMAGE]
                     [--sdk_harness_container_image_overrides SDK_HARNESS_CONTAINER_IMAGE_OVERRIDES]
                     [--default_sdk_harness_log_level DEFAULT_SDK_HARNESS_LOG_LEVEL]
                     [--sdk_harness_log_level_overrides SDK_HARNESS_LOG_LEVEL_OVERRIDES]
                     [--use_public_ips] [--no_use_public_ips]
                     [--min_cpu_platform MIN_CPU_PLATFORM]
                     [--dataflow_worker_jar DATAFLOW_WORKER_JAR]
                     [--dataflow_job_file DATAFLOW_JOB_FILE]
                     [--experiment EXPERIMENTS]
                     [--number_of_worker_harness_threads NUMBER_OF_WORKER_HARNESS_THREADS]
                     [--profile_cpu] [--profile_memory]
                     [--profile_location PROFILE_LOCATION]
                     [--profile_sample_rate PROFILE_SAMPLE_RATE]
                     [--requirements_file REQUIREMENTS_FILE]
                     [--requirements_cache REQUIREMENTS_CACHE]
                     [--requirements_cache_only_sources]
                     [--setup_file SETUP_FILE] [--beam_plugin BEAM_PLUGINS]
                     [--pickle_library {cloudpickle,default,dill}]
                     [--save_main_session] [--sdk_location SDK_LOCATION]
                     [--extra_package EXTRA_PACKAGES]
                     [--prebuild_sdk_container_engine PREBUILD_SDK_CONTAINER_ENGINE]
                     [--prebuild_sdk_container_base_image PREBUILD_SDK_CONTAINER_BASE_IMAGE]
                     [--cloud_build_machine_type CLOUD_BUILD_MACHINE_TYPE]
                     [--docker_registry_push_url DOCKER_REGISTRY_PUSH_URL]
                     [--job_endpoint JOB_ENDPOINT]
                     [--artifact_endpoint ARTIFACT_ENDPOINT]
                     [--job_server_timeout JOB_SERVER_TIMEOUT]
                     [--environment_type ENVIRONMENT_TYPE]
                     [--environment_config ENVIRONMENT_CONFIG]
                     [--environment_option ENVIRONMENT_OPTIONS]
                     [--sdk_worker_parallelism SDK_WORKER_PARALLELISM]
                     [--environment_cache_millis ENVIRONMENT_CACHE_MILLIS]
                     [--output_executable_path OUTPUT_EXECUTABLE_PATH]
                     [--artifacts_dir ARTIFACTS_DIR] [--job_port JOB_PORT]
                     [--artifact_port ARTIFACT_PORT]
                     [--expansion_port EXPANSION_PORT]
                     [--job_server_java_launcher JOB_SERVER_JAVA_LAUNCHER]
                     [--job_server_jvm_properties JOB_SERVER_JVM_PROPERTIES]
                     [--flink_master FLINK_MASTER]
                     [--flink_version {1.12,1.13,1.14,1.15}]
                     [--flink_job_server_jar FLINK_JOB_SERVER_JAR]
                     [--flink_submit_uber_jar]
                     [--spark_master_url SPARK_MASTER_URL]
                     [--spark_job_server_jar SPARK_JOB_SERVER_JAR]
                     [--spark_submit_uber_jar]
                     [--spark_rest_url SPARK_REST_URL] [--spark_version {2,3}]
                     [--on_success_matcher ON_SUCCESS_MATCHER]
                     [--dry_run DRY_RUN]
                     [--wait_until_finish_duration WAIT_UNTIL_FINISH_DURATION]
                     [--pubsub_root_url PUBSUBROOTURL]
                     [--s3_access_key_id S3_ACCESS_KEY_ID]
                     [--s3_secret_access_key S3_SECRET_ACCESS_KEY]
                     [--s3_session_token S3_SESSION_TOKEN]
                     [--s3_endpoint_url S3_ENDPOINT_URL]
                     [--s3_region_name S3_REGION_NAME]
                     [--s3_api_version S3_API_VERSION] [--s3_verify S3_VERIFY]
                     [--s3_disable_ssl]
                     [--publish_to_big_query PUBLISH_TO_BIG_QUERY]
                     [--metrics_dataset METRICS_DATASET]
                     [--metrics_table METRICS_TABLE]
                     [--influx_measurement INFLUX_MEASUREMENT]
                     [--influx_db_name INFLUX_DB_NAME]
                     [--influx_hostname INFLUX_HOSTNAME]
                     [--input_options INPUT_OPTIONS] [--timeout_ms TIMEOUT_MS]
                     [--iterations ITERATIONS]
                     [--number_of_counter_operations NUMBER_OF_COUNTER_OPERATIONS]
                     [--number_of_counters=0 NUMBER_OF_COUNTERS=0]
pardo_test.py: error: argument --number_of_counters=0: expected one argument
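The "expected one argument" failure above is consistent with the flag name itself containing "=0": note the usage entry [--number_of_counters=0 NUMBER_OF_COUNTERS=0], which suggests the literal string "--number_of_counters=0" was registered as an option name. A minimal, hypothetical argparse sketch of that mis-registration and the conventional fix (this is not the actual pardo_test.py source):

```python
import argparse

# Illustrative reproduction only: assumes the harness registered the literal
# string "--number_of_counters=0" (with "=0" fused in) as an option name.
parser = argparse.ArgumentParser(prog="pardo_test.py")
parser.add_argument("--number_of_counters=0")  # "=0" fused into the flag name

# argparse matches a registered option string exactly before splitting the
# token on "=", so the trailing "0" is never treated as the option's value
# and parsing aborts: "argument --number_of_counters=0: expected one argument".
try:
    parser.parse_args(["--number_of_counters=0"])
except SystemExit:
    print("parse failed as in the log above")

# The conventional registration keeps the default out of the flag name, and
# then the "--flag=value" spelling on the command line parses normally.
fixed = argparse.ArgumentParser(prog="pardo_test.py")
fixed.add_argument("--number_of_counters", type=int, default=0)
print(fixed.parse_args(["--number_of_counters=0"]).number_of_counters)
```

With the second parser, both "--number_of_counters 0" and "--number_of_counters=0" are accepted, which matches how the other load-test flags in the usage text are declared.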

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 9s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/vntnpm3wcstzg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_ParDo_Dataflow_Streaming #663

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/663/display/redirect?page=changes>

Changes:

[naireenhussain] add new pubsub urn

[Pablo Estrada] Several requests to show experiments in Dataflow UI

[byronellis] Add org.pentaho to calcite relocated packages to fix vendoring

[noreply] Adding VladMatyunin as collaborator (#22239)

[noreply] Mark session runner as deprecated (#22242)

[noreply] Update google-cloud-core dependency to <3 (#22237)

[noreply] Move WC integration test to generic registration (#22248)

[noreply] Move Xlang Go examples to generic registration (#22249)


------------------------------------------
[...truncated 65.81 KB...]
  Using cached joblib-1.0.1-py3-none-any.whl (303 kB)
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
INFO: pip is looking at multiple versions of orjson to determine which version is compatible with other requirements. This could take a while.
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. See https://pip.pypa.io/warnings/backtracking for guidance. If you want to abort this run, press Ctrl + C.
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
INFO: pip is looking at multiple versions of httplib2 to determine which version is compatible with other requirements. This could take a while.
Collecting httplib2<0.21.0,>=0.8
  Using cached httplib2-0.20.2-py3-none-any.whl (96 kB)
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
INFO: pip is looking at multiple versions of numpy to determine which version is compatible with other requirements. This could take a while.
  Using cached httplib2-0.20.1-py3-none-any.whl (96 kB)
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. See https://pip.pypa.io/warnings/backtracking for guidance. If you want to abort this run, press Ctrl + C.
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
INFO: pip is looking at multiple versions of hdfs to determine which version is compatible with other requirements. This could take a while.
Collecting hdfs<3.0.0,>=2.1.0
  Using cached hdfs-2.6.0-py3-none-any.whl (33 kB)
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
INFO: pip is looking at multiple versions of mock to determine which version is compatible with other requirements. This could take a while.
  Using cached hdfs-2.5.8.tar.gz (41 kB)
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. See https://pip.pypa.io/warnings/backtracking for guidance. If you want to abort this run, press Ctrl + C.
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
INFO: pip is looking at multiple versions of grpcio-gcp to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of grpcio to determine which version is compatible with other requirements. This could take a while.
Collecting grpcio<2,>=1.33.1
  Using cached grpcio-1.47.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.5 MB)
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
INFO: pip is looking at multiple versions of joblib to determine which version is compatible with other requirements. This could take a while.
  Using cached grpcio-1.46.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.4 MB)
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. See https://pip.pypa.io/warnings/backtracking for guidance. If you want to abort this run, press Ctrl + C.
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
INFO: pip is looking at multiple versions of google-cloud-vision to determine which version is compatible with other requirements. This could take a while.
Collecting google-cloud-vision<2,>=0.38.0
  Using cached google_cloud_vision-1.0.1-py2.py3-none-any.whl (435 kB)
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
  Using cached google_cloud_vision-1.0.0-py2.py3-none-any.whl (435 kB)
  Using cached google_cloud_vision-0.42.0-py2.py3-none-any.whl (435 kB)
  Using cached google_cloud_vision-0.41.0-py2.py3-none-any.whl (431 kB)
  Using cached google_cloud_vision-0.40.0-py2.py3-none-any.whl (431 kB)
  Using cached google_cloud_vision-0.39.0-py2.py3-none-any.whl (418 kB)
  Using cached google_cloud_vision-0.38.1-py2.py3-none-any.whl (413 kB)
  Using cached google_cloud_vision-0.38.0-py2.py3-none-any.whl (413 kB)
INFO: pip is looking at multiple versions of google-cloud-videointelligence to determine which version is compatible with other requirements. This could take a while.
Collecting google-cloud-videointelligence<2,>=1.8.0
  Using cached google_cloud_videointelligence-1.16.2-py2.py3-none-any.whl (183 kB)
INFO: pip is looking at multiple versions of google-cloud-spanner to determine which version is compatible with other requirements. This could take a while.
Collecting google-cloud-spanner<2,>=1.13.0
  Using cached google_cloud_spanner-1.19.2-py2.py3-none-any.whl (255 kB)
INFO: pip is looking at multiple versions of google-cloud-recommendations-ai to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of google-cloud-pubsublite to determine which version is compatible with other requirements. This could take a while.
Collecting google-cloud-pubsublite<2,>=1.2.0
  Using cached google_cloud_pubsublite-1.4.1-py2.py3-none-any.whl (265 kB)
INFO: pip is looking at multiple versions of google-cloud-pubsub to determine which version is compatible with other requirements. This could take a while.
Collecting google-cloud-pubsub<3,>=2.1.0
  Using cached google_cloud_pubsub-2.13.1-py2.py3-none-any.whl (234 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (3.8.1)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.41.0.dev0-py3-none-any.whl size=2822936 sha256=e5d7abbd0023dfe12b0b74af739fee452e83553514ee172227f1472629268cde
  Stored in directory: /home/jenkins/.cache/pip/wheels/4d/28/a8/75e525f0f56ebf0cac86293ab90763d6a2a3105a27bb3ba779
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, threadpoolctl, tenacity, rsa, pyyaml, python-dateutil, pymysql, pymongo, pyhamcrest, pydot, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, joblib, jmespath, isodate, idna, httplib2, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, certifi, cachetools, attrs, atomicwrites, sqlalchemy, scipy, requests, pyarrow, pluggy, pandas, overrides, oauth2client, mock, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, deprecation, cffi, botocore, scikit-learn, s3transfer, requests-oauthlib, requests_mock, pytest, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.41.0.dev0 atomicwrites-1.4.1 attrs-21.4.0 azure-core-1.24.2 azure-storage-blob-12.13.0 boto3-1.24.28 botocore-1.27.28 cachetools-4.2.4 certifi-2022.6.15 cffi-1.15.1 charset-normalizer-2.1.0 cloudpickle-2.1.0 crcmod-1.7 cryptography-37.0.4 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.5.2 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.6 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.4 google-cloud-bigquery-storage-2.13.2 google-cloud-bigtable-1.7.2 google-cloud-core-2.3.1 google-cloud-datastore-1.15.5 google-cloud-dlp-3.7.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.13.1 google-cloud-pubsublite-1.4.2 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.3 google-cloud-videointelligence-1.16.3 google-cloud-vision-1.0.2 google-crc32c-1.3.0 google-resumable-media-2.3.3 googleapis-common-protos-1.56.4 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.47.0 hdfs-2.7.0 httplib2-0.20.4 idna-3.3 isodate-0.6.1 jmespath-1.0.1 joblib-1.1.0 mock-2.0.0 more-itertools-8.13.0 msrest-0.7.1 numpy-1.21.6 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.7.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.9.0 pluggy-0.13.1 proto-plus-1.20.6 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.28.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sqlalchemy-1.4.39 tenacity-5.1.5 testcontainers-3.6.0 threadpoolctl-3.1.0 typing-extensions-4.3.0 typing-utils-0.1.0 urllib3-1.26.10 wcwidth-0.2.5 websocket-client-1.3.3 wrapt-1.14.1

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.41.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0713125503.1657717377.487264/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0713125503.1657717377.487264/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0713125503.1657717377.487264/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0713125503.1657717377.487264/pipeline.pb in 0 seconds.
usage: pardo_test.py [-h] [--runner RUNNER] [--streaming]
                     [--resource_hint RESOURCE_HINTS]
                     [--beam_services BEAM_SERVICES]
                     [--type_check_strictness {ALL_REQUIRED,DEFAULT_TO_ANY}]
                     [--type_check_additional TYPE_CHECK_ADDITIONAL]
                     [--no_pipeline_type_check] [--runtime_type_check]
                     [--performance_runtime_type_check]
                     [--allow_non_deterministic_key_coders]
                     [--allow_unsafe_triggers]
                     [--no_direct_runner_use_stacked_bundle]
                     [--direct_runner_bundle_repeat DIRECT_RUNNER_BUNDLE_REPEAT]
                     [--direct_num_workers DIRECT_NUM_WORKERS]
                     [--direct_running_mode {in_memory,multi_threading,multi_processing}]
                     [--direct_embed_docker_python]
                     [--dataflow_endpoint DATAFLOW_ENDPOINT]
                     [--project PROJECT] [--job_name JOB_NAME]
                     [--staging_location STAGING_LOCATION]
                     [--temp_location TEMP_LOCATION] [--region REGION]
                     [--service_account_email SERVICE_ACCOUNT_EMAIL]
                     [--no_auth] [--template_location TEMPLATE_LOCATION]
                     [--label LABELS] [--update]
                     [--transform_name_mapping TRANSFORM_NAME_MAPPING]
                     [--enable_streaming_engine]
                     [--dataflow_kms_key DATAFLOW_KMS_KEY]
                     [--create_from_snapshot CREATE_FROM_SNAPSHOT]
                     [--flexrs_goal {COST_OPTIMIZED,SPEED_OPTIMIZED}]
                     [--dataflow_service_option DATAFLOW_SERVICE_OPTIONS]
                     [--enable_hot_key_logging] [--enable_artifact_caching]
                     [--impersonate_service_account IMPERSONATE_SERVICE_ACCOUNT]
                     [--hdfs_host HDFS_HOST] [--hdfs_port HDFS_PORT]
                     [--hdfs_user HDFS_USER] [--hdfs_full_urls]
                     [--num_workers NUM_WORKERS]
                     [--max_num_workers MAX_NUM_WORKERS]
                     [--autoscaling_algorithm {NONE,THROUGHPUT_BASED}]
                     [--worker_machine_type MACHINE_TYPE]
                     [--disk_size_gb DISK_SIZE_GB]
                     [--worker_disk_type DISK_TYPE]
                     [--worker_region WORKER_REGION]
                     [--worker_zone WORKER_ZONE] [--zone ZONE]
                     [--network NETWORK] [--subnetwork SUBNETWORK]
                     [--worker_harness_container_image WORKER_HARNESS_CONTAINER_IMAGE]
                     [--sdk_container_image SDK_CONTAINER_IMAGE]
                     [--sdk_harness_container_image_overrides SDK_HARNESS_CONTAINER_IMAGE_OVERRIDES]
                     [--default_sdk_harness_log_level DEFAULT_SDK_HARNESS_LOG_LEVEL]
                     [--sdk_harness_log_level_overrides SDK_HARNESS_LOG_LEVEL_OVERRIDES]
                     [--use_public_ips] [--no_use_public_ips]
                     [--min_cpu_platform MIN_CPU_PLATFORM]
                     [--dataflow_worker_jar DATAFLOW_WORKER_JAR]
                     [--dataflow_job_file DATAFLOW_JOB_FILE]
                     [--experiment EXPERIMENTS]
                     [--number_of_worker_harness_threads NUMBER_OF_WORKER_HARNESS_THREADS]
                     [--profile_cpu] [--profile_memory]
                     [--profile_location PROFILE_LOCATION]
                     [--profile_sample_rate PROFILE_SAMPLE_RATE]
                     [--requirements_file REQUIREMENTS_FILE]
                     [--requirements_cache REQUIREMENTS_CACHE]
                     [--requirements_cache_only_sources]
                     [--setup_file SETUP_FILE] [--beam_plugin BEAM_PLUGINS]
                     [--pickle_library {cloudpickle,default,dill}]
                     [--save_main_session] [--sdk_location SDK_LOCATION]
                     [--extra_package EXTRA_PACKAGES]
                     [--prebuild_sdk_container_engine PREBUILD_SDK_CONTAINER_ENGINE]
                     [--prebuild_sdk_container_base_image PREBUILD_SDK_CONTAINER_BASE_IMAGE]
                     [--cloud_build_machine_type CLOUD_BUILD_MACHINE_TYPE]
                     [--docker_registry_push_url DOCKER_REGISTRY_PUSH_URL]
                     [--job_endpoint JOB_ENDPOINT]
                     [--artifact_endpoint ARTIFACT_ENDPOINT]
                     [--job_server_timeout JOB_SERVER_TIMEOUT]
                     [--environment_type ENVIRONMENT_TYPE]
                     [--environment_config ENVIRONMENT_CONFIG]
                     [--environment_option ENVIRONMENT_OPTIONS]
                     [--sdk_worker_parallelism SDK_WORKER_PARALLELISM]
                     [--environment_cache_millis ENVIRONMENT_CACHE_MILLIS]
                     [--output_executable_path OUTPUT_EXECUTABLE_PATH]
                     [--artifacts_dir ARTIFACTS_DIR] [--job_port JOB_PORT]
                     [--artifact_port ARTIFACT_PORT]
                     [--expansion_port EXPANSION_PORT]
                     [--job_server_java_launcher JOB_SERVER_JAVA_LAUNCHER]
                     [--job_server_jvm_properties JOB_SERVER_JVM_PROPERTIES]
                     [--flink_master FLINK_MASTER]
                     [--flink_version {1.12,1.13,1.14,1.15}]
                     [--flink_job_server_jar FLINK_JOB_SERVER_JAR]
                     [--flink_submit_uber_jar]
                     [--spark_master_url SPARK_MASTER_URL]
                     [--spark_job_server_jar SPARK_JOB_SERVER_JAR]
                     [--spark_submit_uber_jar]
                     [--spark_rest_url SPARK_REST_URL] [--spark_version {2,3}]
                     [--on_success_matcher ON_SUCCESS_MATCHER]
                     [--dry_run DRY_RUN]
                     [--wait_until_finish_duration WAIT_UNTIL_FINISH_DURATION]
                     [--pubsub_root_url PUBSUBROOTURL]
                     [--s3_access_key_id S3_ACCESS_KEY_ID]
                     [--s3_secret_access_key S3_SECRET_ACCESS_KEY]
                     [--s3_session_token S3_SESSION_TOKEN]
                     [--s3_endpoint_url S3_ENDPOINT_URL]
                     [--s3_region_name S3_REGION_NAME]
                     [--s3_api_version S3_API_VERSION] [--s3_verify S3_VERIFY]
                     [--s3_disable_ssl]
                     [--publish_to_big_query PUBLISH_TO_BIG_QUERY]
                     [--metrics_dataset METRICS_DATASET]
                     [--metrics_table METRICS_TABLE]
                     [--influx_measurement INFLUX_MEASUREMENT]
                     [--influx_db_name INFLUX_DB_NAME]
                     [--influx_hostname INFLUX_HOSTNAME]
                     [--input_options INPUT_OPTIONS] [--timeout_ms TIMEOUT_MS]
                     [--iterations ITERATIONS]
                     [--number_of_counter_operations NUMBER_OF_COUNTER_OPERATIONS]
                     [--number_of_counters=0 NUMBER_OF_COUNTERS=0]
pardo_test.py: error: argument --number_of_counters=0: expected one argument
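The usage line `[--number_of_counters=0 NUMBER_OF_COUNTERS=0]` shows that the option string registered with argparse literally contained `=0`, i.e. the default value was fused into the flag name somewhere in the load-test wiring. Passing `--number_of_counters=0` on the command line then matches that flag name exactly (argparse checks exact option-string matches before splitting on `=`), leaving no value behind, which yields exactly the "expected one argument" error above. A minimal reproduction sketch, assuming that shape of `add_argument` call (the real definition lives in the load-test pipeline options, not shown here):

```python
import argparse

# Buggy shape implied by the usage line: default fused into the option name.
buggy = argparse.ArgumentParser(prog='pardo_test.py')
buggy.add_argument('--number_of_counters=0')

try:
    # "--number_of_counters=0" exact-matches the registered option string,
    # so argparse consumes it as the flag itself and finds no value for it.
    buggy.parse_args(['--number_of_counters=0'])
except SystemExit:
    pass  # argparse prints the usage block and "expected one argument"

# Likely intended definition: flag name and default kept separate.
fixed = argparse.ArgumentParser(prog='pardo_test.py')
fixed.add_argument('--number_of_counters', type=int, default=0)
args = fixed.parse_args(['--number_of_counters=0'])
print(args.number_of_counters)
```

With the separated definition, `--number_of_counters=0` is split on `=` as usual and parses to the integer `0`.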

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 29s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/wsdw27oen6gqy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_LoadTests_Python_ParDo_Dataflow_Streaming - Build # 662 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_LoadTests_Python_ParDo_Dataflow_Streaming - Build # 662 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/662/ to view the results.

Build failed in Jenkins: beam_LoadTests_Python_ParDo_Dataflow_Streaming #661

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/661/display/redirect?page=changes>

Changes:

[noreply] Parallelizable DataFrame/Series mean (#22174)


------------------------------------------
[...truncated 25.47 KB...]
  Using cached google_api_core-1.31.6-py2.py3-none-any.whl (93 kB)
Requirement already satisfied: setuptools>=40.3.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-auth<3,>=1.18.0->apache-beam==2.41.0.dev0) (63.1.0)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.47.0-py3-none-any.whl (10.0 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.1.0-py3-none-any.whl (14 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Requirement already satisfied: pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from httplib2<0.21.0,>=0.8->apache-beam==2.41.0.dev0) (3.0.9)
Collecting pbr>=0.11
  Using cached pbr-5.9.0-py2.py3-none-any.whl (112 kB)
Collecting pluggy<1.0,>=0.12
  Using cached pluggy-0.13.1-py2.py3-none-any.whl (18 kB)
Collecting atomicwrites>=1.0
  Using cached atomicwrites-1.4.1-py2.py3-none-any.whl
Collecting wcwidth
  Using cached wcwidth-0.2.5-py2.py3-none-any.whl (30 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (2.1.3)
Collecting more-itertools>=4.0.0
  Using cached more_itertools-8.13.0-py3-none-any.whl (51 kB)
Collecting attrs>=17.4.0
  Using cached attrs-21.4.0-py2.py3-none-any.whl (60 kB)
Requirement already satisfied: py>=1.5.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (1.11.0)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.6.15-py3-none-any.whl (160 kB)
Collecting charset-normalizer<3,>=2
  Using cached charset_normalizer-2.1.0-py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.10-py2.py3-none-any.whl (139 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker>=4.0.0
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.3-py3-none-any.whl (54 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.3-py2.py3-none-any.whl (211 kB)
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.8.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.8.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.3-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.2-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) (3.8.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.41.0.dev0-py3-none-any.whl size=2821181 sha256=e3839fe61260178e86c915286b9ee6ab633c3bdf0a8b9a1cd096614effd34de9
  Stored in directory: /home/jenkins/.cache/pip/wheels/4d/28/a8/75e525f0f56ebf0cac86293ab90763d6a2a3105a27bb3ba779
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, threadpoolctl, tenacity, rsa, pyyaml, python-dateutil, pymysql, pymongo, pyhamcrest, pydot, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, joblib, jmespath, isodate, idna, httplib2, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, certifi, cachetools, attrs, atomicwrites, sqlalchemy, scipy, requests, pyarrow, pluggy, pandas, overrides, oauth2client, mock, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, deprecation, cffi, botocore, scikit-learn, s3transfer, requests-oauthlib, requests_mock, pytest, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.41.0.dev0 atomicwrites-1.4.1 attrs-21.4.0 azure-core-1.24.2 azure-storage-blob-12.13.0 boto3-1.24.26 botocore-1.27.26 cachetools-4.2.4 certifi-2022.6.15 cffi-1.15.1 charset-normalizer-2.1.0 cloudpickle-2.1.0 crcmod-1.7 cryptography-37.0.4 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.5.2 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.6 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.4 google-cloud-bigquery-storage-2.13.2 google-cloud-bigtable-1.7.2 google-cloud-core-1.7.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.7.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.13.1 google-cloud-pubsublite-1.4.2 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.3 google-cloud-videointelligence-1.16.3 google-cloud-vision-1.0.2 google-crc32c-1.3.0 google-resumable-media-2.3.3 googleapis-common-protos-1.56.3 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.47.0 hdfs-2.7.0 httplib2-0.20.4 idna-3.3 isodate-0.6.1 jmespath-1.0.1 joblib-1.1.0 mock-2.0.0 more-itertools-8.13.0 msrest-0.7.1 numpy-1.21.6 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.7.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.9.0 pluggy-0.13.1 proto-plus-1.20.6 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.28.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sqlalchemy-1.4.39 tenacity-5.1.5 testcontainers-3.6.0 threadpoolctl-3.1.0 typing-extensions-4.3.0 typing-utils-0.1.0 urllib3-1.26.10 wcwidth-0.2.5 websocket-client-1.3.3 wrapt-1.14.1

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.41.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0706185453.1657544496.127813/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0706185453.1657544496.127813/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0706185453.1657544496.127813/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0706185453.1657544496.127813/pipeline.pb in 0 seconds.
usage: pardo_test.py [-h] [--runner RUNNER] [--streaming]
                     [--resource_hint RESOURCE_HINTS]
                     [--beam_services BEAM_SERVICES]
                     [--type_check_strictness {ALL_REQUIRED,DEFAULT_TO_ANY}]
                     [--type_check_additional TYPE_CHECK_ADDITIONAL]
                     [--no_pipeline_type_check] [--runtime_type_check]
                     [--performance_runtime_type_check]
                     [--allow_non_deterministic_key_coders]
                     [--allow_unsafe_triggers]
                     [--no_direct_runner_use_stacked_bundle]
                     [--direct_runner_bundle_repeat DIRECT_RUNNER_BUNDLE_REPEAT]
                     [--direct_num_workers DIRECT_NUM_WORKERS]
                     [--direct_running_mode {in_memory,multi_threading,multi_processing}]
                     [--direct_embed_docker_python]
                     [--dataflow_endpoint DATAFLOW_ENDPOINT]
                     [--project PROJECT] [--job_name JOB_NAME]
                     [--staging_location STAGING_LOCATION]
                     [--temp_location TEMP_LOCATION] [--region REGION]
                     [--service_account_email SERVICE_ACCOUNT_EMAIL]
                     [--no_auth] [--template_location TEMPLATE_LOCATION]
                     [--label LABELS] [--update]
                     [--transform_name_mapping TRANSFORM_NAME_MAPPING]
                     [--enable_streaming_engine]
                     [--dataflow_kms_key DATAFLOW_KMS_KEY]
                     [--create_from_snapshot CREATE_FROM_SNAPSHOT]
                     [--flexrs_goal {COST_OPTIMIZED,SPEED_OPTIMIZED}]
                     [--dataflow_service_option DATAFLOW_SERVICE_OPTIONS]
                     [--enable_hot_key_logging] [--enable_artifact_caching]
                     [--impersonate_service_account IMPERSONATE_SERVICE_ACCOUNT]
                     [--hdfs_host HDFS_HOST] [--hdfs_port HDFS_PORT]
                     [--hdfs_user HDFS_USER] [--hdfs_full_urls]
                     [--num_workers NUM_WORKERS]
                     [--max_num_workers MAX_NUM_WORKERS]
                     [--autoscaling_algorithm {NONE,THROUGHPUT_BASED}]
                     [--worker_machine_type MACHINE_TYPE]
                     [--disk_size_gb DISK_SIZE_GB]
                     [--worker_disk_type DISK_TYPE]
                     [--worker_region WORKER_REGION]
                     [--worker_zone WORKER_ZONE] [--zone ZONE]
                     [--network NETWORK] [--subnetwork SUBNETWORK]
                     [--worker_harness_container_image WORKER_HARNESS_CONTAINER_IMAGE]
                     [--sdk_container_image SDK_CONTAINER_IMAGE]
                     [--sdk_harness_container_image_overrides SDK_HARNESS_CONTAINER_IMAGE_OVERRIDES]
                     [--default_sdk_harness_log_level DEFAULT_SDK_HARNESS_LOG_LEVEL]
                     [--sdk_harness_log_level_overrides SDK_HARNESS_LOG_LEVEL_OVERRIDES]
                     [--use_public_ips] [--no_use_public_ips]
                     [--min_cpu_platform MIN_CPU_PLATFORM]
                     [--dataflow_worker_jar DATAFLOW_WORKER_JAR]
                     [--dataflow_job_file DATAFLOW_JOB_FILE]
                     [--experiment EXPERIMENTS]
                     [--number_of_worker_harness_threads NUMBER_OF_WORKER_HARNESS_THREADS]
                     [--profile_cpu] [--profile_memory]
                     [--profile_location PROFILE_LOCATION]
                     [--profile_sample_rate PROFILE_SAMPLE_RATE]
                     [--requirements_file REQUIREMENTS_FILE]
                     [--requirements_cache REQUIREMENTS_CACHE]
                     [--requirements_cache_only_sources]
                     [--setup_file SETUP_FILE] [--beam_plugin BEAM_PLUGINS]
                     [--pickle_library {cloudpickle,default,dill}]
                     [--save_main_session] [--sdk_location SDK_LOCATION]
                     [--extra_package EXTRA_PACKAGES]
                     [--prebuild_sdk_container_engine PREBUILD_SDK_CONTAINER_ENGINE]
                     [--prebuild_sdk_container_base_image PREBUILD_SDK_CONTAINER_BASE_IMAGE]
                     [--cloud_build_machine_type CLOUD_BUILD_MACHINE_TYPE]
                     [--docker_registry_push_url DOCKER_REGISTRY_PUSH_URL]
                     [--job_endpoint JOB_ENDPOINT]
                     [--artifact_endpoint ARTIFACT_ENDPOINT]
                     [--job_server_timeout JOB_SERVER_TIMEOUT]
                     [--environment_type ENVIRONMENT_TYPE]
                     [--environment_config ENVIRONMENT_CONFIG]
                     [--environment_option ENVIRONMENT_OPTIONS]
                     [--sdk_worker_parallelism SDK_WORKER_PARALLELISM]
                     [--environment_cache_millis ENVIRONMENT_CACHE_MILLIS]
                     [--output_executable_path OUTPUT_EXECUTABLE_PATH]
                     [--artifacts_dir ARTIFACTS_DIR] [--job_port JOB_PORT]
                     [--artifact_port ARTIFACT_PORT]
                     [--expansion_port EXPANSION_PORT]
                     [--job_server_java_launcher JOB_SERVER_JAVA_LAUNCHER]
                     [--job_server_jvm_properties JOB_SERVER_JVM_PROPERTIES]
                     [--flink_master FLINK_MASTER]
                     [--flink_version {1.12,1.13,1.14,1.15}]
                     [--flink_job_server_jar FLINK_JOB_SERVER_JAR]
                     [--flink_submit_uber_jar]
                     [--spark_master_url SPARK_MASTER_URL]
                     [--spark_job_server_jar SPARK_JOB_SERVER_JAR]
                     [--spark_submit_uber_jar]
                     [--spark_rest_url SPARK_REST_URL] [--spark_version {2,3}]
                     [--on_success_matcher ON_SUCCESS_MATCHER]
                     [--dry_run DRY_RUN]
                     [--wait_until_finish_duration WAIT_UNTIL_FINISH_DURATION]
                     [--pubsub_root_url PUBSUBROOTURL]
                     [--s3_access_key_id S3_ACCESS_KEY_ID]
                     [--s3_secret_access_key S3_SECRET_ACCESS_KEY]
                     [--s3_session_token S3_SESSION_TOKEN]
                     [--s3_endpoint_url S3_ENDPOINT_URL]
                     [--s3_region_name S3_REGION_NAME]
                     [--s3_api_version S3_API_VERSION] [--s3_verify S3_VERIFY]
                     [--s3_disable_ssl]
                     [--publish_to_big_query PUBLISH_TO_BIG_QUERY]
                     [--metrics_dataset METRICS_DATASET]
                     [--metrics_table METRICS_TABLE]
                     [--influx_measurement INFLUX_MEASUREMENT]
                     [--influx_db_name INFLUX_DB_NAME]
                     [--influx_hostname INFLUX_HOSTNAME]
                     [--input_options INPUT_OPTIONS] [--timeout_ms TIMEOUT_MS]
                     [--iterations ITERATIONS]
                     [--number_of_counter_operations NUMBER_OF_COUNTER_OPERATIONS]
                     [--number_of_counters=0 NUMBER_OF_COUNTERS=0]
pardo_test.py: error: argument --number_of_counters=0: expected one argument

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 13s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/yt4xikjcssg7g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org