Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/03/05 07:42:40 UTC

beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 139 - Aborted!

beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 139 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/139/ to view the results.

Jenkins build is back to normal : beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow #159

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/159/display/redirect?page=changes>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 158 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 158 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/158/ to view the results.

beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 157 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 157 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/157/ to view the results.

beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 156 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 156 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/156/ to view the results.

Build failed in Jenkins: beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow #155

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/155/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #16976 from [BEAM-14010] [Website] Add Playground

[noreply] [BEAM-12447] Upgrade cloud build client and add/cleanup options (#17032)


------------------------------------------
[...truncated 456.50 KB...]
grpc-google-iam-v1==0.12.3
grpcio==1.44.0
grpcio-gcp==0.2.2
grpcio-status==1.44.0
guppy3==3.1.2
h5py==3.6.0
hdfs==2.6.0
httplib2==0.19.1
idna==3.3
importlib-metadata==4.11.2
joblib==1.1.0
keras==2.8.0
Keras-Preprocessing==1.1.2
libclang==13.0.0
libcst==0.4.1
Markdown==3.3.6
mmh3==3.0.0
mock==2.0.0
more-itertools==8.12.0
mypy-extensions==0.4.3
nltk==3.7
nose==1.3.7
numpy==1.22.2
oauth2client==4.1.3
oauthlib==3.2.0
opt-einsum==3.3.0
orjson==3.6.7
overrides==6.1.0
packaging==21.3
pandas==1.4.1
parameterized==0.7.5
pbr==5.8.1
pip==21.2.4
pluggy==0.13.1
proto-plus==1.20.3
protobuf==3.19.4
psycopg2-binary==2.9.3
py==1.11.0
pyarrow==6.0.1
pyasn1==0.4.8
pyasn1-modules==0.2.8
pycparser==2.21
pydot==1.4.2
PyHamcrest==1.10.1
pymongo==3.12.3
PyMySQL==1.0.2
pyparsing==2.4.7
pytest==4.6.11
pytest-forked==1.4.0
pytest-timeout==1.4.2
pytest-xdist==1.34.0
python-dateutil==2.8.2
python-snappy==0.6.1
pytz==2021.3
PyYAML==6.0
regex==2022.3.2
requests==2.27.1
requests-mock==1.9.3
requests-oauthlib==1.3.1
rsa==4.8
scikit-learn==1.0.2
scipy==1.8.0
setuptools==58.1.0
six==1.16.0
soupsieve==2.3.1
SQLAlchemy==1.4.31
tenacity==5.1.5
tensorboard==2.8.0
tensorboard-data-server==0.6.1
tensorboard-plugin-wit==1.8.1
tensorflow==2.8.0
tensorflow-io-gcs-filesystem==0.24.0
termcolor==1.1.0
testcontainers==3.4.2
tf-estimator-nightly==2.8.0.dev2021122109
threadpoolctl==3.1.0
tqdm==4.63.0
typing-inspect==0.7.1
typing-utils==0.1.0
typing_extensions==4.1.1
uritemplate==4.1.1
urllib3==1.26.8
wcwidth==0.2.5
websocket-client==1.3.1
Werkzeug==2.0.3
wheel==0.37.1
wrapt==1.13.3
zipp==3.7.0
Removing intermediate container e6df08f07798
 ---> 78f0265171fd
Step 16/31 : COPY target/LICENSE /opt/apache/beam/
 ---> 964c99575a35
Step 17/31 : COPY target/LICENSE.python /opt/apache/beam/
 ---> 19d6fcf44785
Step 18/31 : COPY target/NOTICE /opt/apache/beam/
 ---> e170c2af2fde
Step 19/31 : COPY target/launcher/linux_amd64/boot /opt/apache/beam/
 ---> 50a8a2ccc863
Step 20/31 : ENTRYPOINT ["/opt/apache/beam/boot"]
 ---> Running in 63394ca01b3e
Removing intermediate container 63394ca01b3e
 ---> c53e9538a3eb
Step 21/31 : FROM beam as third_party_licenses
 ---> c53e9538a3eb
Step 22/31 : ARG pull_licenses
 ---> Running in 2e2a647eb377
Removing intermediate container 2e2a647eb377
 ---> bee1195ca741
Step 23/31 : COPY target/license_scripts /tmp/license_scripts/
 ---> a98c7f0c0898
Step 24/31 : COPY target/LICENSE target/go-licenses/* /opt/apache/beam/third_party_licenses/golang/
 ---> 72a0903f59d5
Step 25/31 : RUN rm /opt/apache/beam/third_party_licenses/golang/LICENSE
 ---> Running in aaa2b8187aab
Removing intermediate container aaa2b8187aab
 ---> 8da366809963
Step 26/31 : COPY target/license_scripts /tmp/license_scripts/
 ---> b8f314127755
Step 27/31 : RUN if [ "$pull_licenses" = "true" ] ; then       pip install 'pip-licenses<3.0.0' pyyaml tenacity &&       python /tmp/license_scripts/pull_licenses_py.py ;     fi
 ---> Running in 60bb0c7ca558
Collecting pip-licenses<3.0.0
  Downloading pip_licenses-2.3.0-py3-none-any.whl (14 kB)
Requirement already satisfied: pyyaml in /usr/local/lib/python3.9/site-packages (6.0)
Requirement already satisfied: tenacity in /usr/local/lib/python3.9/site-packages (5.1.5)
Collecting PTable
  Downloading PTable-0.9.2.tar.gz (31 kB)
Requirement already satisfied: six>=1.9.0 in /usr/local/lib/python3.9/site-packages (from tenacity) (1.16.0)
Building wheels for collected packages: PTable
  Building wheel for PTable (setup.py): started
  Building wheel for PTable (setup.py): finished with status 'done'
  Created wheel for PTable: filename=PTable-0.9.2-py3-none-any.whl size=22925 sha256=533c3de48de666b0a4bc20216d1a87dd6d9edab795795256ae24d230832fa9ad
  Stored in directory: /root/.cache/pip/wheels/b8/d5/8b/e0c9765594e0dc8093aae5f67eacc08b9b533da598c710b54a
Successfully built PTable
Installing collected packages: PTable, pip-licenses
Successfully installed PTable-0.9.2 pip-licenses-2.3.0
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
WARNING: You are using pip version 21.2.4; however, version 22.0.4 is available.
You should consider upgrading via the '/usr/local/bin/python -m pip install --upgrade pip' command.
INFO:root:Successfully pulled licenses for 137 dependencies
Skip pulling license for  bs4
Removing intermediate container 60bb0c7ca558
 ---> fc056214359a
Step 28/31 : FROM beam
 ---> c53e9538a3eb
Step 29/31 : ARG pull_licenses
 ---> Running in 05a8182b9767
Removing intermediate container 05a8182b9767
 ---> 5294da93d3ee
Step 30/31 : COPY --from=third_party_licenses /opt/apache/beam/third_party_licenses /opt/apache/beam/third_party_licenses
 ---> 1dd19c2cafac
Step 31/31 : RUN if [ "$pull_licenses" != "true" ] ; then       rm -rf /opt/apache/beam/third_party_licenses ;     fi
 ---> Running in f8974740d1f3
Removing intermediate container f8974740d1f3
 ---> 35a319a504ea
Successfully built 35a319a504ea
Successfully tagged apache/beam_python3.9_sdk:2.38.0.dev

> Task :runners:google-cloud-dataflow-java:buildAndPushDockerPythonContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 18.03.

As an alternative, use `gcloud auth configure-docker` to configure `docker` to
use `gcloud` as a credential helper, then use `docker` as you would for non-GCR
registries, e.g. `docker pull gcr.io/project-id/my-image`. Add
`--verbosity=error` to silence this warning: `gcloud docker
--verbosity=error -- pull gcr.io/project-id/my-image`.

See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker
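A minimal sketch of the replacement workflow this deprecation notice describes, using the placeholder image path from the warning text (gcr.io/project-id/my-image is not a real image):

  # Register gcloud as a Docker credential helper once per machine or CI agent
  gcloud auth configure-docker

  # Then use plain docker commands against GCR as usual
  docker pull gcr.io/project-id/my-image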


> Task :runners:google-cloud-dataflow-java:setupXVR
> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerSetup
> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingSql
E0309 06:27:23.975501298  453048 fork_posix.cc:70]           Fork support is only compatible with the epoll1 and poll polling strategies

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingSql FAILED
> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerCleanup
> Task :runners:google-cloud-dataflow-java:cleanupXVR UP-TO-DATE

> Task :runners:google-cloud-dataflow-java:cleanUpDockerPythonImages FAILED
Error: No such image: us.gcr.io/apache-beam-testing/java-postcommit-it/python:20220309060328
ERROR: (gcloud.container.images.untag) Image could not be found: [us.gcr.io/apache-beam-testing/java-postcommit-it/python:20220309060328]

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages FAILED
Error: No such image: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220309060328
ERROR: (gcloud.container.images.untag) Image could not be found: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220309060328]
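Both cleanup failures are gcloud untag calls against tags that do not exist in the registry. A hedged sketch of a cleanup step that tolerates a missing tag (the tag below is copied from the errors above; this is an illustration, not the project's actual build logic):

  IMAGE=us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220309060328
  if gcloud container images describe "${IMAGE}" >/dev/null 2>&1; then
    gcloud container images untag --quiet "${IMAGE}"
  else
    echo "Image ${IMAGE} not found; nothing to untag"
  fi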

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingSql'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 335

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerPythonImages'.
> Process 'command 'gcloud'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 294

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
> Process 'command 'gcloud'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings
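When reproducing locally, the flags suggested in the failure output above can be combined in one invocation; a hedged example (assumes the repository's ./gradlew wrapper and the failing task named earlier):

  ./gradlew :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingSql \
      --warning-mode all --stacktrace --info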

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 34m 28s
153 actionable tasks: 105 executed, 42 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/hgjjgajs3xxfa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 154 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 154 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/154/ to view the results.

beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 153 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 153 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/153/ to view the results.

beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 152 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 152 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/152/ to view the results.

beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 151 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 151 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/151/ to view the results.

beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 150 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 150 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/150/ to view the results.

beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 149 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 149 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/149/ to view the results.

beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 148 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 148 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/148/ to view the results.

beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 147 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 147 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/147/ to view the results.

beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 146 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 146 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/146/ to view the results.

beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 145 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 145 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/145/ to view the results.

beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 144 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 144 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/144/ to view the results.

beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 143 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 143 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/143/ to view the results.

Build failed in Jenkins: beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow #142

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/142/display/redirect>

Changes:


------------------------------------------
[...truncated 427.50 KB...]
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tbeam:transform:org.apache.beam:spanner_replace:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@15b204a1'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tbeam:transform:org.apache.beam:spanner_insert_or_update:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@77167fb7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tbeam:transform:org.apache.beam:spanner_delete:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@1fe20588'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tbeam:transform:org.apache.beam:spanner_read:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@6ce139a4'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tbeam:transform:org.apache.beam:kafka_read_with_metadata:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@6973bf95'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tbeam:transform:org.apache.beam:kafka_read_without_metadata:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@2ddc8ecb'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tbeam:transform:org.apache.beam:kafka_write:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@229d10bd'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tbeam:external:java:generate_sequence:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@47542153'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Mar 06, 2022 12:27:54 AM org.apache.beam.sdk.expansion.service.ExpansionService expand'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b"INFO: Expanding 'SqlTransform(beam:external:java:sql:v1)' with URN 'beam:external:java:sql:v1'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Mar 06, 2022 12:27:54 AM org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader payloadToConfig'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b"WARNING: Configuration class 'org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar$Configuration' has no schema registered. Attempting to construct with setter approach."
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Mar 06, 2022 12:27:55 AM org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader payloadToConfig'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b"WARNING: Configuration class 'org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar$Configuration' has no schema registered. Attempting to construct with setter approach."
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Mar 06, 2022 12:27:59 AM org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner convertToBeamRelInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: BEAMPlan>'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b"BeamZetaSqlCalcRel(expr#0=[{inputs}], expr#1=[1:BIGINT], expr#2=['foo':VARCHAR], expr#3=[3.1400000000000001243E0:DOUBLE], int=[$t1], str=[$t2], flt=[$t3])"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  BeamValuesRel(tuples=[[{ 0 }]])'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b''
INFO     apache_beam.runners.portability.stager:stager.py:325 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING  root:environments.py:371 Make sure that locally built Python SDK docker image has Python 3.9 interpreter.
INFO     root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.9_sdk:2.38.0.dev
INFO     root:environments.py:295 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python39-fnapi:beam-master-20220208
INFO     root:environments.py:302 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python39-fnapi:beam-master-20220208" for Docker environment
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function pack_combiners at 0x7f942cbeef70> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function sort_stages at 0x7f942ceb6790> ====================
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:461 Defaulting to the temp_location as staging_location: gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0306002805-161913-f9kotkri.1646526485.162291/icedtea-sound-ebvtNFkfFXg4aaYFuDnwKpwDSjzsaZqlqv5iKxPTr-U.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0306002805-161913-f9kotkri.1646526485.162291/icedtea-sound-ebvtNFkfFXg4aaYFuDnwKpwDSjzsaZqlqv5iKxPTr-U.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0306002805-161913-f9kotkri.1646526485.162291/jaccess-ULFTCPsb6cLYZ0f1BG1FQfczmHNaZCx8plXuRDKpBqE.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0306002805-161913-f9kotkri.1646526485.162291/jaccess-ULFTCPsb6cLYZ0f1BG1FQfczmHNaZCx8plXuRDKpBqE.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0306002805-161913-f9kotkri.1646526485.162291/localedata-dUHqyGxaTVCjfTI8MckPYarZ3_mwf62udkxaHi1aKns.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0306002805-161913-f9kotkri.1646526485.162291/localedata-dUHqyGxaTVCjfTI8MckPYarZ3_mwf62udkxaHi1aKns.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0306002805-161913-f9kotkri.1646526485.162291/nashorn-XdUndQGroXOP9NCsfITpBERYcbbGXVHLjbvNWXCh-3A.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0306002805-161913-f9kotkri.1646526485.162291/nashorn-XdUndQGroXOP9NCsfITpBERYcbbGXVHLjbvNWXCh-3A.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0306002805-161913-f9kotkri.1646526485.162291/cldrdata-YqzuKX1QnLCOo0cwjKRdBhGrip_ltIJZg-APT60tUPA.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0306002805-161913-f9kotkri.1646526485.162291/cldrdata-YqzuKX1QnLCOo0cwjKRdBhGrip_ltIJZg-APT60tUPA.jar in 1 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0306002805-161913-f9kotkri.1646526485.162291/dnsns-dhEp186udEF6X6chZus-RJzWRmzlccxx1_btlXWayVI.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0306002805-161913-f9kotkri.1646526485.162291/dnsns-dhEp186udEF6X6chZus-RJzWRmzlccxx1_btlXWayVI.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0306002805-161913-f9kotkri.1646526485.162291/beam-sdks-java-extensions-sql-expansion-service-2.38.0-SNAPSHOT-wnEH3TwLl--ewbQLSyIo0YyXrQmxFf_wU1cIgphBJIo.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0306002805-161913-f9kotkri.1646526485.162291/beam-sdks-java-extensions-sql-expansion-service-2.38.0-SNAPSHOT-wnEH3TwLl--ewbQLSyIo0YyXrQmxFf_wU1cIgphBJIo.jar in 33 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0306002805-161913-f9kotkri.1646526485.162291/dataflow_python_sdk.tar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0306002805-161913-f9kotkri.1646526485.162291/dataflow_python_sdk.tar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0306002805-161913-f9kotkri.1646526485.162291/pipeline.pb...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0306002805-161913-f9kotkri.1646526485.162291/pipeline.pb in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:886 Create job: <Job
                                                                           clientRequestId: '20220306002805163211-1790'
                                                                           createTime: '2022-03-06T00:28:42.691642Z'
                                                                           currentStateTime: '1970-01-01T00:00:00Z'
                                                                           id: '2022-03-05_16_28_41-10243177491131866728'
                                                                           location: 'us-central1'
                                                                           name: 'beamapp-jenkins-0306002805-161913-f9kotkri'
                                                                           projectId: 'apache-beam-testing'
                                                                           stageStates: []
                                                                           startTime: '2022-03-06T00:28:42.691642Z'
                                                                           steps: []
                                                                           tempFiles: []
                                                                           type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:888 Created job with id: [2022-03-05_16_28_41-10243177491131866728]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:889 Submitted job: 2022-03-05_16_28_41-10243177491131866728
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:890 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-05_16_28_41-10243177491131866728?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 2022-03-05_16_28_41-10243177491131866728 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:43.381Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2022-03-05_16_28_41-10243177491131866728. The number of workers will be between 1 and 1000.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:43.563Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2022-03-05_16_28_41-10243177491131866728.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:45.670Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:46.468Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:46.492Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:46.552Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:46.587Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/CoGroupByKeyImpl/GroupByKey: GroupByKey not followed by a combiner.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:46.623Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:46.647Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:46.698Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:46.725Z: JOB_MESSAGE_DEBUG: Inserted coder converter before flatten ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_15
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:46.753Z: JOB_MESSAGE_DEBUG: Inserted coder converter before flatten ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_15
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:46.806Z: JOB_MESSAGE_DETAILED: Unzipping flatten ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_15 for input ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Tag-0-_13.None-post11
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:46.832Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write, through flatten assert_that/Group/CoGroupByKeyImpl/Flatten, into producer assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:46.863Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values) into assert_that/Group/CoGroupByKeyImpl/GroupByKey/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:46.890Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/RestoreTags into assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:46.920Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/RestoreTags
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:46.949Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:46.978Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write into assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.022Z: JOB_MESSAGE_DETAILED: Fusing consumer SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_7/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_7/Create.Values/Read(CreateSource)/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.060Z: JOB_MESSAGE_DETAILED: Fusing consumer external_2SqlTransform-beam-external-java-sql-v1--BeamValuesRel_7-Create-Values-Read-CreateSource--ParDo-Bound/PairWithRestriction into SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_7/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.092Z: JOB_MESSAGE_DETAILED: Fusing consumer external_2SqlTransform-beam-external-java-sql-v1--BeamValuesRel_7-Create-Values-Read-CreateSource--ParDo-Bound/SplitWithSizing into external_2SqlTransform-beam-external-java-sql-v1--BeamValuesRel_7-Create-Values-Read-CreateSource--ParDo-Bound/PairWithRestriction
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.140Z: JOB_MESSAGE_DETAILED: Fusing consumer SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc) into external_2SqlTransform-beam-external-java-sql-v1--BeamValuesRel_7-Create-Values-Read-CreateSource--ParDo-Bound/ProcessElementAndRestrictionWithSizing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.177Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.213Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:3229>) into assert_that/Create/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.268Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:3229>)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.292Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/Tag[0] into assert_that/Create/Map(decode)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.319Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity into assert_that/Group/CoGroupByKeyImpl/Tag[0]
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.353Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.383Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/Tag[1] into assert_that/ToVoidKey
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.412Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity into assert_that/Group/CoGroupByKeyImpl/Tag[1]
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.451Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.472Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.498Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.529Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.684Z: JOB_MESSAGE_DEBUG: Executing wait step start23
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.756Z: JOB_MESSAGE_BASIC: Executing operation SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_7/Create.Values/Read(CreateSource)/Impulse+SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_7/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+external_2SqlTransform-beam-external-java-sql-v1--BeamValuesRel_7-Create-Values-Read-CreateSource--ParDo-Bound/PairWithRestriction+external_2SqlTransform-beam-external-java-sql-v1--BeamValuesRel_7-Create-Values-Read-CreateSource--ParDo-Bound/SplitWithSizing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.786Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.808Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.833Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.897Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:47.975Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/CoGroupByKeyImpl/GroupByKey/Session" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:28:48.081Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at core.py:3229>)+assert_that/Create/Map(decode)+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:29:07.612Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:29:39.583Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:30:26.042Z: JOB_MESSAGE_WARNING: A worker was unable to start up.  Error: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220306000305 not found: manifest unknown: Failed to fetch "20220306000305" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20220306000305".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:30:53.688Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:31:42.775Z: JOB_MESSAGE_WARNING: A worker was unable to start up.  Error: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220306000305 not found: manifest unknown: Failed to fetch "20220306000305" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20220306000305".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:32:08.806Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:32:54.580Z: JOB_MESSAGE_WARNING: A worker was unable to start up.  Error: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220306000305 not found: manifest unknown: Failed to fetch "20220306000305" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20220306000305".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:33:23.775Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:34:11.728Z: JOB_MESSAGE_WARNING: A worker was unable to start up.  Error: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220306000305 not found: manifest unknown: Failed to fetch "20220306000305" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20220306000305".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:34:40.450Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:35:26.088Z: JOB_MESSAGE_WARNING: A worker was unable to start up.  Error: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220306000305 not found: manifest unknown: Failed to fetch "20220306000305" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20220306000305".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:35:26.117Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Job appears to be stuck. Several workers have failed to start up in a row, and no worker has successfully started up for this job. Last error reported: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220306000305 not found: manifest unknown: Failed to fetch "20220306000305" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20220306000305".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image..
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:35:26.176Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at core.py:3229>)+assert_that/Create/Map(decode)+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:35:26.248Z: JOB_MESSAGE_WARNING: Unable to delete temp files: "gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0306002805-161913-f9kotkri.1646526485.162291/dax-tmp-2022-03-05_16_28_41-10243177491131866728-S03-0-baa91ec93493b9c2/tmp-baa91ec93493b043@DAX.sdfmeta."
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:35:26.283Z: JOB_MESSAGE_WARNING: S03:SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_7/Create.Values/Read(CreateSource)/Impulse+SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_7/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+external_2SqlTransform-beam-external-java-sql-v1--BeamValuesRel_7-Create-Values-Read-CreateSource--ParDo-Bound/PairWithRestriction+external_2SqlTransform-beam-external-java-sql-v1--BeamValuesRel_7-Create-Values-Read-CreateSource--ParDo-Bound/SplitWithSizing failed.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:35:26.321Z: JOB_MESSAGE_BASIC: Finished operation SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_7/Create.Values/Read(CreateSource)/Impulse+SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_7/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+external_2SqlTransform-beam-external-java-sql-v1--BeamValuesRel_7-Create-Values-Read-CreateSource--ParDo-Bound/PairWithRestriction+external_2SqlTransform-beam-external-java-sql-v1--BeamValuesRel_7-Create-Values-Read-CreateSource--ParDo-Bound/SplitWithSizing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:35:26.383Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:35:26.447Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:35:26.479Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:35:26.868Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:36:13.892Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-06T00:36:13.928Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 2022-03-05_16_28_41-10243177491131866728 is in state JOB_STATE_FAILED
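The repeated worker startup failures above all point at one missing image tag. A hedged way to confirm whether that SDK container tag exists in the registry (tag copied from the error messages; read access to the apache-beam-testing project is assumed):

  gcloud container images list-tags us.gcr.io/apache-beam-testing/java-postcommit-it/java \
      --filter="tags:20220306000305"

  # or inspect the specific tag directly
  gcloud container images describe \
      us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220306000305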
=============================== warnings summary ===============================
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:54: DeprecationWarning: invalid escape sequence \c
    self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf'))  # pylint: disable=anomalous-backslash-in-string

apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:62: DeprecationWarning: invalid escape sequence \d
    self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'),  # pylint: disable=anomalous-backslash-in-string

<unknown>:54
<unknown>:54
<unknown>:54
<unknown>:54
<unknown>:54
<unknown>:54
<unknown>:54
<unknown>:54
  <unknown>:54: DeprecationWarning: invalid escape sequence \c

<unknown>:62
<unknown>:62
<unknown>:62
<unknown>:62
<unknown>:62
<unknown>:62
<unknown>:62
<unknown>:62
  <unknown>:62: DeprecationWarning: invalid escape sequence \d

<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.9/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/python/pytest_xlangSqlValidateRunner.xml> -
=================== 9 failed, 28 warnings in 1095.39 seconds ===================

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingSql FAILED
> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerCleanup
> Task :runners:google-cloud-dataflow-java:cleanupXVR UP-TO-DATE

> Task :runners:google-cloud-dataflow-java:cleanUpDockerPythonImages FAILED
Error: No such image: us.gcr.io/apache-beam-testing/java-postcommit-it/python:20220306000305
ERROR: (gcloud.container.images.untag) Image could not be found: [us.gcr.io/apache-beam-testing/java-postcommit-it/python:20220306000305]

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages FAILED
Error: No such image: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220306000305
ERROR: (gcloud.container.images.untag) Image could not be found: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220306000305]

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingSql'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 335

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerPythonImages'.
> Process 'command 'gcloud'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 294

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
> Process 'command 'gcloud'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 34m 20s
153 actionable tasks: 105 executed, 42 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/cn7o4n5talups

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 141 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 141 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/141/ to view the results.

beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 140 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow - Build # 140 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/140/ to view the results.