Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/10/19 06:38:03 UTC

Build failed in Jenkins: beam_PostCommit_XVR_Spark #2899

See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/2899/display/redirect>

Changes:


------------------------------------------
[...truncated 82.19 KB...]

> Task :sdks:python:sdist
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/build/gradleenv/1922375555/lib/python3.6/site-packages/setuptools/dist.py>:487: UserWarning: Normalizing '2.35.0.dev' to '2.35.0.dev0'
  warnings.warn(tmpl.format(**locals()))
INFO:gen_protos:Regenerating Python proto definitions (no output files).
INFO:gen_protos:Found protoc_gen_mypy at <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/build/gradleenv/1922375555/bin/protoc-gen-mypy>
beam_interactive_api.proto:36:1: warning: Import google/protobuf/timestamp.proto is unused.
Writing mypy to external_transforms_pb2.pyi
Writing mypy to beam_runner_api_pb2.pyi
Writing mypy to standard_window_fns_pb2.pyi
Writing mypy to metrics_pb2.pyi
Writing mypy to endpoints_pb2.pyi
Writing mypy to schema_pb2.pyi
Writing mypy to beam_artifact_api_pb2.pyi
Writing mypy to beam_expansion_api_pb2.pyi
Writing mypy to beam_job_api_pb2.pyi
Writing mypy to beam_provision_api_pb2.pyi
Writing mypy to beam_fn_api_pb2.pyi
Writing mypy to beam_interactive_api_pb2.pyi
RefactoringTool: Skipping optional fixer: idioms
RefactoringTool: Skipping optional fixer: ws_comma
RefactoringTool: Refactored <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: Refactored <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: Refactored <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: Refactored <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: Refactored <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: Refactored <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: Refactored <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_interactive_api_pb2.py>
RefactoringTool: Refactored <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: Refactored <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: Refactored <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: Refactored <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>

> Task :sdks:java:container:java8:copyJavaThirdPartyLicenses

> Task :sdks:python:sdist
RefactoringTool: Refactored <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: Refactored <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2_grpc.py>
RefactoringTool: No changes to <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: Refactored <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: Refactored <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: No changes to <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: Refactored <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
RefactoringTool: Files that were modified:
RefactoringTool: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_interactive_api_pb2.py>
RefactoringTool: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2_grpc.py>
RefactoringTool: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
INFO:gen_protos:Writing urn stubs: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2_urns.py>
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: no files found matching 'LICENSE.python'
warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md


> Task :sdks:python:container:py36:copyDockerfileDependencies
> Task :sdks:python:installGcpTest

> Task :release:go-licenses:go:dockerRun
+ go-licenses save github.com/apache/beam/sdks/go/container --save_path=/output/licenses

> Task :release:go-licenses:java:dockerRun
+ go-licenses save github.com/apache/beam/sdks/java/container --save_path=/output/licenses

> Task :release:go-licenses:py:dockerRun
+ go-licenses save github.com/apache/beam/sdks/python/container --save_path=/output/licenses
+ tee /output/licenses/list.csv
+ go-licenses csv github.com/apache/beam/sdks/python/container

> Task :runners:spark:2:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:spark:2:classes

> Task :release:go-licenses:go:dockerRun
+ go-licenses csv github.com/apache/beam/sdks/go/container
+ tee /output/licenses/list.csv
+ chmod -R a+w /output/licenses

> Task :runners:spark:2:jar
> Task :runners:spark:2:job-server:compileJava NO-SOURCE
> Task :runners:spark:2:job-server:classes UP-TO-DATE

> Task :release:go-licenses:py:dockerRun
+ chmod -R a+w /output/licenses

> Task :release:go-licenses:go:createLicenses

> Task :release:go-licenses:java:dockerRun
+ go-licenses csv github.com/apache/beam/sdks/java/container
+ tee /output/licenses/list.csv
+ chmod -R a+w /output/licenses

> Task :sdks:go:container:copyGolangLicenses
> Task :sdks:go:container:dockerPrepare
> Task :release:go-licenses:java:createLicenses
> Task :sdks:java:container:java8:copyGolangLicenses
> Task :release:go-licenses:py:createLicenses
> Task :sdks:python:container:py36:copyGolangLicenses
> Task :sdks:java:container:java8:dockerPrepare
> Task :sdks:python:container:py36:dockerPrepare

> Task :runners:spark:2:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:spark:2:testClasses
> Task :runners:spark:2:testJar
> Task :runners:spark:2:job-server:shadowJar
> Task :sdks:go:container:docker
> Task :sdks:java:container:java8:docker
> Task :sdks:python:container:py36:docker
Collecting pymongo==3.10.1
  Downloading pymongo-3.10.1-cp36-cp36m-manylinux2014_x86_64.whl (460 kB)
Collecting pytz==2020.1
  Downloading pytz-2020.1-py2.py3-none-any.whl (510 kB)
Collecting pyyaml==5.4
  Downloading PyYAML-5.4-cp36-cp36m-manylinux1_x86_64.whl (640 kB)
Collecting typing-extensions==3.7.4.3
  Downloading typing_extensions-3.7.4.3-py3-none-any.whl (22 kB)
Collecting google-auth==1.31.0
  Downloading google_auth-1.31.0-py2.py3-none-any.whl (147 kB)
Collecting google-api-core==1.22.2
  Downloading google_api_core-1.22.2-py2.py3-none-any.whl (91 kB)
Collecting google-apitools==0.5.31
  Downloading google-apitools-0.5.31.tar.gz (173 kB)
Collecting google-cloud-pubsub==1.0.2
  Downloading google_cloud_pubsub-1.0.2-py2.py3-none-any.whl (118 kB)
Collecting google-cloud-bigquery==1.26.1
  Downloading google_cloud_bigquery-1.26.1-py2.py3-none-any.whl (170 kB)
Collecting google-cloud-bigtable==1.0.0
  Downloading google_cloud_bigtable-1.0.0-py2.py3-none-any.whl (232 kB)
Collecting google-cloud-core==1.4.1
  Downloading google_cloud_core-1.4.1-py2.py3-none-any.whl (26 kB)
Collecting google-cloud-datastore==1.15.3
  Downloading google_cloud_datastore-1.15.3-py2.py3-none-any.whl (134 kB)
Collecting google-cloud-dlp==0.13.0
  Downloading google_cloud_dlp-0.13.0-py2.py3-none-any.whl (151 kB)
Collecting google-cloud-language==1.3.0
  Downloading google_cloud_language-1.3.0-py2.py3-none-any.whl (83 kB)
Collecting google-cloud-profiler==3.0.4
  Downloading google-cloud-profiler-3.0.4.tar.gz (32 kB)
Collecting google-cloud-recommendations-ai==0.2.0
  Downloading google_cloud_recommendations_ai-0.2.0-py2.py3-none-any.whl (180 kB)
Collecting google-cloud-spanner==1.13.0
  Downloading google_cloud_spanner-1.13.0-py2.py3-none-any.whl (212 kB)
Collecting google-cloud-videointelligence==1.13.0
  Downloading google_cloud_videointelligence-1.13.0-py2.py3-none-any.whl (177 kB)
Collecting google-cloud-vision==0.42.0
  Downloading google_cloud_vision-0.42.0-py2.py3-none-any.whl (435 kB)
Collecting google-python-cloud-debugger==2.15
  Downloading google_python_cloud_debugger-2.15-cp36-cp36m-manylinux1_x86_64.whl (767 kB)
Collecting grpcio-gcp==0.2.2
  Downloading grpcio_gcp-0.2.2-py2.py3-none-any.whl (9.4 kB)
Collecting beautifulsoup4==4.9.1
  Downloading beautifulsoup4-4.9.1-py3-none-any.whl (115 kB)
Collecting bs4==0.0.1
  Downloading bs4-0.0.1.tar.gz (1.1 kB)
Collecting cython==0.29.21
  Downloading Cython-0.29.21-cp36-cp36m-manylinux1_x86_64.whl (2.0 MB)
Collecting cachetools==3.1.1
  Downloading cachetools-3.1.1-py2.py3-none-any.whl (11 kB)
Collecting dataclasses==0.8
  Downloading dataclasses-0.8-py3-none-any.whl (19 kB)
Collecting guppy3==3.0.10
  Downloading guppy3-3.0.10-cp36-cp36m-manylinux2010_x86_64.whl (597 kB)
Collecting mmh3==2.5.1
  Downloading mmh3-2.5.1.tar.gz (9.8 kB)
Collecting orjson==3.6.1
  Downloading orjson-3.6.1-cp36-cp36m-manylinux_2_24_x86_64.whl (233 kB)
Collecting python-dateutil==2.8.1
  Downloading python_dateutil-2.8.1-py2.py3-none-any.whl (227 kB)
Collecting requests==2.24.0
  Downloading requests-2.24.0-py2.py3-none-any.whl (61 kB)
Collecting freezegun==0.3.15
  Downloading freezegun-0.3.15-py2.py3-none-any.whl (14 kB)
Collecting pillow==7.2.0
  Downloading Pillow-7.2.0-cp36-cp36m-manylinux1_x86_64.whl (2.2 MB)
Collecting python-snappy==0.5.4
  Downloading python-snappy-0.5.4.tar.gz (21 kB)

> Task :runners:spark:2:job-server:sparkJobServerSetup
> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerSetup

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerGoUsingJava
Oct 19, 2021 6:17:02 AM org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms
INFO: Registering external transforms: [beam:external:java:kafkaio:externalwithmetadata:v1, beam:external:java:kafkaio:typedwithoutmetadata:v1, beam:external:java:kafka:write:v1, beam:external:java:generate_sequence:v1]
./run_validatesrunner_tests.sh: line 399: go: command not found

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerGoUsingJava FAILED
> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingJava
> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingPython
> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingPythonOnly
> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingJava
> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingPython
> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingSql
> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerCleanup
> Task :runners:spark:2:job-server:sparkJobServerCleanup

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/go/test/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':runners:spark:2:job-server:validatesCrossLanguageRunnerGoUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 127

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
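
Exit value 127 is the shell's "command not found" status, and the trace above shows exactly which command: "./run_validatesrunner_tests.sh: line 399: go: command not found", i.e. the Go toolchain is missing from the Jenkins agent's PATH. A minimal pre-flight guard of this shape (hypothetical; not part of the actual run_validatesrunner_tests.sh) would surface the problem with a clearer message before any expansion services are started:

    #!/bin/bash
    # Hypothetical guard, assuming bash; not taken from the real script.
    if ! command -v go >/dev/null 2>&1; then
      echo "ERROR: 'go' not found on PATH; install Go on the agent or extend PATH before running the cross-language Go suite." >&2
      exit 1
    fi
    go version   # record the toolchain version in the build log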

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 32m 1s
216 actionable tasks: 158 executed, 51 from cache, 7 up-to-date

Publishing build scan...
https://gradle.com/s/l7wm42sqmhpnk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_XVR_Spark #2902

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/2902/display/redirect?page=changes>




Build failed in Jenkins: beam_PostCommit_XVR_Spark #2901

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/2901/display/redirect?page=changes>

Changes:

[samuelw] [BEAM-13042] Prevent unexpected blocking in

[ilya.kozyrev] Implement initial server with grpcweb wrapper

[noreply] [BEAM-13068] Add a SQL API in Beam Go SDK (#15746)


------------------------------------------
[...truncated 479.16 KB...]
tensorboard==2.7.0
tensorboard-data-server==0.6.1
tensorboard-plugin-wit==1.8.0
tensorflow==2.6.0
tensorflow-estimator==2.6.0
termcolor==1.1.0
threadpoolctl==3.0.0
tqdm==4.62.3
typing-extensions==3.7.4.3
typing-inspect==0.7.1
uritemplate==4.1.1
urllib3==1.25.11
wcwidth==0.2.5
Werkzeug==2.0.2
wheel==0.37.0
wrapt==1.12.1
zipp==3.6.0
Removing intermediate container 18dd68a814cd
 ---> 082114988690
Step 22/27 : RUN pip check
 ---> Running in 88d294901977
No broken requirements found.
Removing intermediate container 88d294901977
 ---> 9d639cea20f8
Step 23/27 : COPY target/LICENSE /opt/apache/beam/
 ---> a12a8fc4b355
Step 24/27 : COPY target/LICENSE.python /opt/apache/beam/
 ---> 5611d1b6f0aa
Step 25/27 : COPY target/NOTICE /opt/apache/beam/
 ---> e58b67f8ec84
Step 26/27 : ADD target/launcher/linux_amd64/boot /opt/apache/beam/
 ---> cc7811f6d8e7
Step 27/27 : ENTRYPOINT ["/opt/apache/beam/boot"]
 ---> Running in 3abb5b0a0246
Removing intermediate container 3abb5b0a0246
 ---> aa8fd4296a02
Successfully built aa8fd4296a02
Successfully tagged apache/beam_python3.6_sdk:2.35.0.dev

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerSetup
Launching Java expansion service @ 34461
Launching Python expansion service @ 37127

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerGoUsingJava
$ TESTS=./test/integration/... ./test/regression
$ RUNNER=portable
$ TIMEOUT=1h
$ SIMULTANEOUS=3
$ GCS_LOCATION=gs://temp-storage-for-end-to-end-tests
$ PROJECT=apache-beam-testing
$ DATAFLOW_PROJECT=apache-beam-testing
$ REGION=us-central1
$ trap exit_background_processes SIGINT SIGTERM EXIT
$ key=--io_expansion_jar
$ case --io_expansion_jar in
$ IO_EXPANSION_JAR=<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/java/io/expansion-service/build/libs/beam-sdks-java-io-expansion-service-2.35.0-SNAPSHOT.jar>
$ key=--pipeline_opts
$ case --pipeline_opts in
$ PIPELINE_OPTS=--kafka_jar=<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/java/testing/kafka-service/build/libs/beam-sdks-java-testing-kafka-service-testKafkaService-2.35.0-SNAPSHOT.jar>
$ key=--test_expansion_addr
$ case --test_expansion_addr in
$ TEST_EXPANSION_ADDR=localhost:34461
$ key=--runner
$ case --runner in
$ RUNNER=spark
$ key=--tests
$ case --tests in
$ TESTS=./test/integration/xlang ./test/integration/io/xlang/...
$ key=--endpoint
$ case --endpoint in
$ ENDPOINT=localhost:35409
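
The "$ key=..." and "$ case ... in" lines above are a shell trace of the script's flag parsing. Reconstructed as a sketch (the real loop in run_validatesrunner_tests.sh may differ in detail), the pattern being traced is roughly:

    # Sketch of the flag-parsing loop implied by the trace above; the variable
    # names follow the trace, the loop structure itself is an assumption.
    while [[ $# -gt 0 ]]; do
      key="$1"
      case $key in
        --runner)              RUNNER="$2"; shift 2 ;;
        --endpoint)            ENDPOINT="$2"; shift 2 ;;
        --tests)               TESTS="$2"; shift 2 ;;
        --test_expansion_addr) TEST_EXPANSION_ADDR="$2"; shift 2 ;;
        --io_expansion_jar)    IO_EXPANSION_JAR="$2"; shift 2 ;;
        --pipeline_opts)       PIPELINE_OPTS="$2"; shift 2 ;;
        *)                     echo "Unknown flag: $key" >&2; exit 1 ;;
      esac
    done
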
$ cd <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src>
$ test -d sdks/go/test
$ EXPANSION_PORT=60775
$ IO_EXPANSION_ADDR=localhost:35073
No IO expansion address specified; starting a new IO expansion server on localhost:35073
$ java -jar <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/java/io/expansion-service/build/libs/beam-sdks-java-io-expansion-service-2.35.0-SNAPSHOT.jar> 35073
$ IO_EXPANSION_PID=32374
$ SIMULTANEOUS=1
$ TAG=dev
$ ./gradlew :sdks:go:container:docker -Pdocker-tag=dev
Starting expansion service at localhost:35073
Starting a Gradle Daemon, 4 busy Daemons could not be reused, use --status for details
	beam:external:java:kafkaio:externalwithmetadata:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$51/477289012@3d36e4cd
Oct 19, 2021 6:18:30 PM org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms
INFO: Registering external transforms: [beam:external:java:kafkaio:externalwithmetadata:v1, beam:external:java:kafkaio:typedwithoutmetadata:v1, beam:external:java:kafka:write:v1, beam:external:java:generate_sequence:v1]

	beam:external:java:kafkaio:typedwithoutmetadata:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$51/477289012@6a472554
	beam:external:java:kafka:write:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$51/477289012@7ff2a664
	beam:external:java:generate_sequence:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$51/477289012@525b461a
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:go:container:copyLicenses UP-TO-DATE
> Task :sdks:go:container:dockerClean

> Task :sdks:go:container:goPrepare
Use project GOPATH: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/go/container/.gogradle/project_gopath>

> Task :sdks:go:container:resolveBuildDependencies SKIPPED
> Task :sdks:go:container:installDependencies SKIPPED
> Task :sdks:go:container:buildLinuxAmd64 UP-TO-DATE
> Task :sdks:go:container:goBuild
> Task :sdks:go:container:skipPullLicenses
> Task :sdks:go:container:dockerPrepare
> Task :sdks:go:container:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 43s
8 actionable tasks: 6 executed, 2 up-to-date

Publishing build scan...
https://gradle.com/s/mohbunj4btmry

$ CONTAINER=apache/beam_go_sdk
$ cd sdks/go
>>> RUNNING spark integration tests with pipeline options:  -p 1 -timeout 1h --runner=spark --project=apache-beam-testing --region=us-central1 --environment_type=DOCKER --environment_config=apache/beam_go_sdk:dev --staging_location=gs://temp-storage-for-end-to-end-tests/staging-validatesrunner-test --temp_location=gs://temp-storage-for-end-to-end-tests/temp-validatesrunner-test --dataflow_worker_jar= --endpoint=localhost:35409 --test_expansion_addr=localhost:34461 --io_expansion_addr=localhost:35073 --kafka_jar=<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/java/testing/kafka-service/build/libs/beam-sdks-java-testing-kafka-service-testKafkaService-2.35.0-SNAPSHOT.jar>
$ 
./run_validatesrunner_tests.sh: line 399: go: command not found
$ TEST_EXIT_CODE=0
$ cd ../..
$ exit 127
$ exit 127

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerGoUsingJava FAILED
> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingJava
> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingPython
> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingPythonOnly

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingJava
>>> RUNNING integration tests with pipeline options: --runner=PortableRunner --job_endpoint=localhost:35409 --environment_cache_millis=10000 --experiments=beam_fn_api
>>>   pytest options: 
>>>   collect markers: -m=xlang_transforms
============================= test session starts ==============================
platform linux -- Python 3.6.8, pytest-4.6.11, py-1.10.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.3.0, requests-mock-1.9.3
timeout: 600.0s
timeout method: signal
timeout func_only: False
collected 5094 items / 5084 deselected / 10 selected

apache_beam/io/external/generate_sequence_test.py ...                    [ 30%]
apache_beam/transforms/validate_runner_xlang_test.py .......             [100%]

=============================== warnings summary ===============================
apache_beam/io/filesystems_test.py:54
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:54: DeprecationWarning: invalid escape sequence \c
    self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf'))  # pylint: disable=anomalous-backslash-in-string

apache_beam/io/filesystems_test.py:62
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:62: DeprecationWarning: invalid escape sequence \d
    self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'),  # pylint: disable=anomalous-backslash-in-string

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/pytest_xlangValidateRunner.xml> -
=========== 10 passed, 5084 deselected, 2 warnings in 219.80 seconds ===========
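
For reference, the session header above shows how this suite is selected: pytest collects the Python SDK tests and keeps only those carrying the xlang_transforms marker, with pytest-timeout enforcing the 600 s limit and results written to pytest_xlangValidateRunner.xml. A roughly equivalent manual invocation (reconstructed from the header; the exact command and pipeline-option plumbing used by the Gradle task are not shown in this log) would be:

    cd sdks/python
    # Marker, timeout and report file taken from the session header above;
    # portable-runner pipeline options are passed separately by the task.
    pytest -m xlang_transforms --timeout=600 \
      --junitxml=pytest_xlangValidateRunner.xml \
      apache_beam/transforms/validate_runner_xlang_test.py \
      apache_beam/io/external/generate_sequence_test.py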

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingPython
>>> RUNNING integration tests with pipeline options: --runner=PortableRunner --job_endpoint=localhost:35409 --environment_cache_millis=10000 --experiments=beam_fn_api
>>>   pytest options: 
>>>   collect markers: -m=xlang_transforms
============================= test session starts ==============================
platform linux -- Python 3.6.8, pytest-4.6.11, py-1.10.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.3.0, requests-mock-1.9.3
timeout: 600.0s
timeout method: signal
timeout func_only: False
collected 5094 items / 5084 deselected / 10 selected

apache_beam/io/external/generate_sequence_test.py .ss                    [ 30%]
apache_beam/transforms/validate_runner_xlang_test.py .......             [100%]

- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/pytest_xlangValidateRunner.xml> -
============ 8 passed, 2 skipped, 5084 deselected in 126.66 seconds ============

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingSql
>>> RUNNING integration tests with pipeline options: --runner=PortableRunner --job_endpoint=localhost:35409 --environment_cache_millis=10000 --experiments=beam_fn_api
>>>   pytest options: 
>>>   collect markers: -m=xlang_sql_expansion_service
============================= test session starts ==============================
platform linux -- Python 3.6.8, pytest-4.6.11, py-1.10.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.3.0, requests-mock-1.9.3
timeout: 600.0s
timeout method: signal
timeout func_only: False
collected 5094 items / 5085 deselected / 9 selected

apache_beam/transforms/sql_test.py .........                             [100%]

- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/pytest_xlangSqlValidateRunner.xml> -
================= 9 passed, 5085 deselected in 224.38 seconds ==================

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 32212.
Stopping expansion service pid: 32218.

> Task :runners:spark:2:job-server:sparkJobServerCleanup
Stopping job server pid: 10027.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/go/test/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':runners:spark:2:job-server:validatesCrossLanguageRunnerGoUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 127

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 45m 53s
216 actionable tasks: 154 executed, 55 from cache, 7 up-to-date

Publishing build scan...
https://gradle.com/s/urkfyr6fiecb6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_XVR_Spark #2900

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/2900/display/redirect>

Changes:


------------------------------------------
[...truncated 511.64 KB...]
tensorboard==2.7.0
tensorboard-data-server==0.6.1
tensorboard-plugin-wit==1.8.0
tensorflow==2.6.0
tensorflow-estimator==2.6.0
termcolor==1.1.0
threadpoolctl==3.0.0
tqdm==4.62.3
typing-extensions==3.7.4.3
typing-inspect==0.7.1
uritemplate==4.1.1
urllib3==1.25.11
wcwidth==0.2.5
Werkzeug==2.0.2
wheel==0.37.0
wrapt==1.12.1
zipp==3.6.0
Removing intermediate container be90f5bb8ed4
 ---> 71fc969fdcc3
Step 22/27 : RUN pip check
 ---> Running in bb3c781dec5c
No broken requirements found.
Removing intermediate container bb3c781dec5c
 ---> aa1929c9cfa7
Step 23/27 : COPY target/LICENSE /opt/apache/beam/
 ---> 76e95c657d10
Step 24/27 : COPY target/LICENSE.python /opt/apache/beam/
 ---> f26888cdc4ed
Step 25/27 : COPY target/NOTICE /opt/apache/beam/
 ---> 3fdb95337a0b
Step 26/27 : ADD target/launcher/linux_amd64/boot /opt/apache/beam/
 ---> 17eeb161ffa6
Step 27/27 : ENTRYPOINT ["/opt/apache/beam/boot"]
 ---> Running in 5b2df55ae99d
Removing intermediate container 5b2df55ae99d
 ---> e2a0a6560f36
Successfully built e2a0a6560f36
Successfully tagged apache/beam_python3.6_sdk:2.35.0.dev

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerSetup
Launching Java expansion service @ 33089
Launching Python expansion service @ 39703

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerGoUsingJava
$ TESTS=./test/integration/... ./test/regression
$ RUNNER=portable
$ TIMEOUT=1h
$ SIMULTANEOUS=3
$ GCS_LOCATION=gs://temp-storage-for-end-to-end-tests
$ PROJECT=apache-beam-testing
$ DATAFLOW_PROJECT=apache-beam-testing
$ REGION=us-central1
$ trap exit_background_processes SIGINT SIGTERM EXIT
$ key=--io_expansion_jar
$ case --io_expansion_jar in
$ IO_EXPANSION_JAR=<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/java/io/expansion-service/build/libs/beam-sdks-java-io-expansion-service-2.35.0-SNAPSHOT.jar>
$ key=--pipeline_opts
$ case --pipeline_opts in
$ PIPELINE_OPTS=--kafka_jar=<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/java/testing/kafka-service/build/libs/beam-sdks-java-testing-kafka-service-testKafkaService-2.35.0-SNAPSHOT.jar>
$ key=--test_expansion_addr
$ case --test_expansion_addr in
$ TEST_EXPANSION_ADDR=localhost:33089
$ key=--runner
$ case --runner in
$ RUNNER=spark
$ key=--tests
$ case --tests in
$ TESTS=./test/integration/xlang ./test/integration/io/xlang/...
$ key=--endpoint
$ case --endpoint in
$ ENDPOINT=localhost:41983
$ cd <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src>
$ test -d sdks/go/test
$ EXPANSION_PORT=42235
$ IO_EXPANSION_ADDR=localhost:57767
No IO expansion address specified; starting a new IO expansion server on localhost:57767
$ java -jar <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/java/io/expansion-service/build/libs/beam-sdks-java-io-expansion-service-2.35.0-SNAPSHOT.jar> 57767
$ IO_EXPANSION_PID=32299
$ SIMULTANEOUS=1
$ TAG=dev
$ ./gradlew :sdks:go:container:docker -Pdocker-tag=dev
Starting expansion service at localhost:57767
Starting a Gradle Daemon, 4 busy Daemons could not be reused, use --status for details
Oct 19, 2021 12:13:33 PM org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms
	beam:external:java:kafkaio:externalwithmetadata:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$51/477289012@3d36e4cd
INFO: Registering external transforms: [beam:external:java:kafkaio:externalwithmetadata:v1, beam:external:java:kafkaio:typedwithoutmetadata:v1, beam:external:java:kafka:write:v1, beam:external:java:generate_sequence:v1]
	beam:external:java:kafkaio:typedwithoutmetadata:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$51/477289012@6a472554
	beam:external:java:kafka:write:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$51/477289012@7ff2a664
	beam:external:java:generate_sequence:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$51/477289012@525b461a
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:go:container:copyLicenses UP-TO-DATE
> Task :sdks:go:container:dockerClean

> Task :sdks:go:container:goPrepare
Use project GOPATH: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/go/container/.gogradle/project_gopath>

> Task :sdks:go:container:resolveBuildDependencies SKIPPED
> Task :sdks:go:container:installDependencies SKIPPED
> Task :sdks:go:container:buildLinuxAmd64 UP-TO-DATE
> Task :sdks:go:container:goBuild
> Task :sdks:go:container:skipPullLicenses
> Task :sdks:go:container:dockerPrepare
> Task :sdks:go:container:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 27s
8 actionable tasks: 6 executed, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ewt372zse5nao

$ CONTAINER=apache/beam_go_sdk
$ cd sdks/go
>>> RUNNING spark integration tests with pipeline options:  -p 1 -timeout 1h --runner=spark --project=apache-beam-testing --region=us-central1 --environment_type=DOCKER --environment_config=apache/beam_go_sdk:dev --staging_location=gs://temp-storage-for-end-to-end-tests/staging-validatesrunner-test --temp_location=gs://temp-storage-for-end-to-end-tests/temp-validatesrunner-test --dataflow_worker_jar= --endpoint=localhost:41983 --test_expansion_addr=localhost:33089 --io_expansion_addr=localhost:57767 --kafka_jar=<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/java/testing/kafka-service/build/libs/beam-sdks-java-testing-kafka-service-testKafkaService-2.35.0-SNAPSHOT.jar>
$ 
./run_validatesrunner_tests.sh: line 399: go: command not found
$ TEST_EXIT_CODE=0
$ cd ../..
$ exit 127
$ exit 127

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerGoUsingJava FAILED
> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingJava
> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingPython
> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingPythonOnly

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingJava
>>> RUNNING integration tests with pipeline options: --runner=PortableRunner --job_endpoint=localhost:41983 --environment_cache_millis=10000 --experiments=beam_fn_api
>>>   pytest options: 
>>>   collect markers: -m=xlang_transforms
============================= test session starts ==============================
platform linux -- Python 3.6.8, pytest-4.6.11, py-1.10.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.3.0, requests-mock-1.9.3
timeout: 600.0s
timeout method: signal
timeout func_only: False
collected 5094 items / 5084 deselected / 10 selected

apache_beam/io/external/generate_sequence_test.py ...                    [ 30%]
apache_beam/transforms/validate_runner_xlang_test.py .......             [100%]

=============================== warnings summary ===============================
apache_beam/io/filesystems_test.py:54
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:54: DeprecationWarning: invalid escape sequence \c
    self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf'))  # pylint: disable=anomalous-backslash-in-string

apache_beam/io/filesystems_test.py:62
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:62: DeprecationWarning: invalid escape sequence \d
    self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'),  # pylint: disable=anomalous-backslash-in-string

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/pytest_xlangValidateRunner.xml> -
=========== 10 passed, 5084 deselected, 2 warnings in 103.54 seconds ===========

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingPython
>>> RUNNING integration tests with pipeline options: --runner=PortableRunner --job_endpoint=localhost:41983 --environment_cache_millis=10000 --experiments=beam_fn_api
>>>   pytest options: 
>>>   collect markers: -m=xlang_transforms
============================= test session starts ==============================
platform linux -- Python 3.6.8, pytest-4.6.11, py-1.10.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.3.0, requests-mock-1.9.3
timeout: 600.0s
timeout method: signal
timeout func_only: False
collected 5094 items / 5084 deselected / 10 selected

apache_beam/io/external/generate_sequence_test.py .ss                    [ 30%]
apache_beam/transforms/validate_runner_xlang_test.py .......             [100%]

- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/pytest_xlangValidateRunner.xml> -
============ 8 passed, 2 skipped, 5084 deselected in 58.48 seconds =============

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingSql
>>> RUNNING integration tests with pipeline options: --runner=PortableRunner --job_endpoint=localhost:41983 --environment_cache_millis=10000 --experiments=beam_fn_api
>>>   pytest options: 
>>>   collect markers: -m=xlang_sql_expansion_service
============================= test session starts ==============================
platform linux -- Python 3.6.8, pytest-4.6.11, py-1.10.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.3.0, requests-mock-1.9.3
timeout: 600.0s
timeout method: signal
timeout func_only: False
collected 5094 items / 5085 deselected / 9 selected

apache_beam/transforms/sql_test.py .........                             [100%]

- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/pytest_xlangSqlValidateRunner.xml> -
================= 9 passed, 5085 deselected in 180.01 seconds ==================

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 32177.
Stopping expansion service pid: 32180.

> Task :runners:spark:2:job-server:sparkJobServerCleanup
Stopping job server pid: 31016.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/go/test/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':runners:spark:2:job-server:validatesCrossLanguageRunnerGoUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 127

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 31m 20s
216 actionable tasks: 164 executed, 46 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/aqtypxcbn7mkq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
