Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2023/06/13 15:03:18 UTC
Build failed in Jenkins: beam_PostCommit_Python_Xlang_Gcp_Dataflow #310
See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/310/display/redirect>
Changes:
------------------------------------------
[...truncated 368.96 KB...]
INFO apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:112 Created dataset python_xlang_storage_write_1686668185_cbd25c in project apache-beam-testing
INFO apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:115 expansion port: 41493
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:406 Automatically enabling Dataflow Runner V2 since the pipeline used cross-language transforms.
INFO apache_beam.runners.portability.stager:stager.py:330 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build/apache_beam-2.49.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl>" to staging location.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:453 Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO root:environments.py:295 Using provided Python SDK container image: gcr.io/apache-beam-testing/beam-sdk/beam_python3.11_sdk:latest
INFO root:environments.py:302 Python SDK container image set to "gcr.io/apache-beam-testing/beam-sdk/beam_python3.11_sdk:latest" for Docker environment
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:710 ==================== <function pack_combiners at 0x7f796ca5c0e0> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:710 ==================== <function sort_stages at 0x7f796ca5c9a0> ====================
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:466 Defaulting to the temp_location as staging_location: gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0613145628-704739-htv3r2nv.1686668188.705119/icedtea-sound-Sx4pTbVkRyMN68Iwm-IRjHx-_IrVB7OJ04Hgf0GOshw.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0613145628-704739-htv3r2nv.1686668188.705119/icedtea-sound-Sx4pTbVkRyMN68Iwm-IRjHx-_IrVB7OJ04Hgf0GOshw.jar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0613145628-704739-htv3r2nv.1686668188.705119/jaccess-cLw_6VKUXVSSR02qNdmHfiCEQC4xgO9cGkcXeJGoXLU.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0613145628-704739-htv3r2nv.1686668188.705119/jaccess-cLw_6VKUXVSSR02qNdmHfiCEQC4xgO9cGkcXeJGoXLU.jar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0613145628-704739-htv3r2nv.1686668188.705119/localedata-m64AoS_ttn_e9NIqwSVR0lu2Z1Z5ql2mff-rP_lcNFc.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0613145628-704739-htv3r2nv.1686668188.705119/localedata-m64AoS_ttn_e9NIqwSVR0lu2Z1Z5ql2mff-rP_lcNFc.jar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0613145628-704739-htv3r2nv.1686668188.705119/nashorn-N0Jsn4-u6InD0KSL1iQKJxisnjbN1yx2Il6jtukpLvI.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0613145628-704739-htv3r2nv.1686668188.705119/nashorn-N0Jsn4-u6InD0KSL1iQKJxisnjbN1yx2Il6jtukpLvI.jar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0613145628-704739-htv3r2nv.1686668188.705119/cldrdata-5EWPOR1vxHHNVt2c5NwrXn6onAk_PiwkFY34pXx-PXY.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0613145628-704739-htv3r2nv.1686668188.705119/cldrdata-5EWPOR1vxHHNVt2c5NwrXn6onAk_PiwkFY34pXx-PXY.jar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0613145628-704739-htv3r2nv.1686668188.705119/dnsns-WBg9XKwp_aXals1SHuDLDILJwBF0Qp8_Vs6ch4cmKVc.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0613145628-704739-htv3r2nv.1686668188.705119/dnsns-WBg9XKwp_aXals1SHuDLDILJwBF0Qp8_Vs6ch4cmKVc.jar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0613145628-704739-htv3r2nv.1686668188.705119/beam-sdks-java-io-google-cloud-platform-expansion-service-2.49.0-SNAPSHOT-ZNHNo_gtCXtpGbJTmpVAgvs4Xcu_Is6OytDydUVUOrg.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0613145628-704739-htv3r2nv.1686668188.705119/beam-sdks-java-io-google-cloud-platform-expansion-service-2.49.0-SNAPSHOT-ZNHNo_gtCXtpGbJTmpVAgvs4Xcu_Is6OytDydUVUOrg.jar in 5 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0613145628-704739-htv3r2nv.1686668188.705119/apache_beam-2.49.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0613145628-704739-htv3r2nv.1686668188.705119/apache_beam-2.49.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0613145628-704739-htv3r2nv.1686668188.705119/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0613145628-704739-htv3r2nv.1686668188.705119/pipeline.pb in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:909 Create job: <Job
clientRequestId: '20230613145628705933-1335'
createTime: '2023-06-13T14:56:36.437869Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2023-06-13_07_56_36-3978289479758699563'
location: 'us-central1'
name: 'beamapp-jenkins-0613145628-704739-htv3r2nv'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2023-06-13T14:56:36.437869Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:911 Created job with id: [2023-06-13_07_56_36-3978289479758699563]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:912 Submitted job: 2023-06-13_07_56_36-3978289479758699563
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:913 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-13_07_56_36-3978289479758699563?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-13_07_56_36-3978289479758699563?project=apache-beam-testing
INFO apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log:
INFO apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-13_07_56_36-3978289479758699563?project=apache-beam-testing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job 2023-06-13_07_56_36-3978289479758699563 is in state JOB_STATE_RUNNING
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:37.137Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2023-06-13_07_56_36-3978289479758699563. The number of workers will be between 1 and 1000.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:37.211Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2023-06-13_07_56_36-3978289479758699563.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:39.006Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.183Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.210Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.263Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.298Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey: GroupByKey not followed by a combiner.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.322Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.356Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.381Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.421Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.457Z: JOB_MESSAGE_DETAILED: Created new flatten external_10StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion-payload-schemat29-c19 to unzip producers of external_10StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion-payload-schemat32
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.489Z: JOB_MESSAGE_DETAILED: Fusing consumer StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter) into StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows/Map/ParMultiDo(Anonymous)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.523Z: JOB_MESSAGE_DETAILED: Unzipping flatten external_10StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion-payload-schemat29 for input external_10StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion-payload-schemat7.failedRows
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.556Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows and errors/Map/ParMultiDo(Anonymous), through flatten StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/flattenErrors, into producer StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert to message
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.579Z: JOB_MESSAGE_DETAILED: Unzipping flatten external_10StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion-payload-schemat29-c19 for input external_10StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion-payload-schemat7.failedRows
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.611Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows/Map/ParMultiDo(Anonymous), through flatten StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/flattenErrors, into producer StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert to message
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.632Z: JOB_MESSAGE_DETAILED: Fusing consumer StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows/Map/ParMultiDo(Anonymous) into StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write Records
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.663Z: JOB_MESSAGE_DETAILED: Fusing consumer StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows and errors/Map/ParMultiDo(Anonymous) into StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write Records
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.695Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:3634>) into Create/Impulse
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.727Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:3634>)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.749Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.780Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.814Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.836Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.868Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.900Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.924Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.953Z: JOB_MESSAGE_DETAILED: Fusing consumer StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter) into Create/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:40.985Z: JOB_MESSAGE_DETAILED: Fusing consumer StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous) into StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:41.010Z: JOB_MESSAGE_DETAILED: Fusing consumer StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign into StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:41.040Z: JOB_MESSAGE_DETAILED: Fusing consumer StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert to message into StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:41.068Z: JOB_MESSAGE_DETAILED: Fusing consumer StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write Records into StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert to message
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:41.094Z: JOB_MESSAGE_DETAILED: Fusing consumer StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign into StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write Records
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:41.119Z: JOB_MESSAGE_DETAILED: Fusing consumer StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) into StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:41.145Z: JOB_MESSAGE_DETAILED: Fusing consumer StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify into StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:41.168Z: JOB_MESSAGE_DETAILED: Fusing consumer StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write into StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:41.198Z: JOB_MESSAGE_DETAILED: Fusing consumer StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow into StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Read
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:41.231Z: JOB_MESSAGE_DETAILED: Fusing consumer StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) into StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:41.256Z: JOB_MESSAGE_DETAILED: Fusing consumer StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) into StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:41.275Z: JOB_MESSAGE_DETAILED: Fusing consumer StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) into StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:41.306Z: JOB_MESSAGE_DETAILED: Fusing consumer StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Finalize writes/ParMultiDo(StorageApiFinalizeWrites) into StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:41.349Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:41.373Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:41.396Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:41.426Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:41.549Z: JOB_MESSAGE_DEBUG: Executing wait step start34
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:41.601Z: JOB_MESSAGE_BASIC: Executing operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:41.651Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:41.679Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:41.911Z: JOB_MESSAGE_BASIC: Finished operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:41.962Z: JOB_MESSAGE_DEBUG: Value "Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:42.021Z: JOB_MESSAGE_BASIC: Executing operation Create/Impulse+Create/FlatMap(<lambda at core.py:3634>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:56:50.801Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:57:27.885Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T14:59:59.689Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T15:00:25.132Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T15:00:25.602Z: JOB_MESSAGE_BASIC: Finished operation Create/Impulse+Create/FlatMap(<lambda at core.py:3634>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T15:00:25.669Z: JOB_MESSAGE_BASIC: Executing operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T15:00:26.315Z: JOB_MESSAGE_BASIC: Finished operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T15:00:26.380Z: JOB_MESSAGE_BASIC: Executing operation StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T15:00:26.533Z: JOB_MESSAGE_BASIC: Finished operation StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T15:00:26.603Z: JOB_MESSAGE_DEBUG: Value "StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Session" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T15:00:26.671Z: JOB_MESSAGE_BASIC: Executing operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert to message+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows and errors/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write Records+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows and errors/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T15:00:46.894Z: JOB_MESSAGE_BASIC: Finished operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert to message+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows and errors/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write Records+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed 
rows/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows and errors/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T15:00:46.966Z: JOB_MESSAGE_BASIC: Executing operation StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Close
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T15:00:47.892Z: JOB_MESSAGE_BASIC: Finished operation StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Close
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T15:00:47.951Z: JOB_MESSAGE_BASIC: Executing operation StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Read+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Finalize writes/ParMultiDo(StorageApiFinalizeWrites)
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T15:00:49.785Z: JOB_MESSAGE_BASIC: Finished operation StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Read+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Finalize writes/ParMultiDo(StorageApiFinalizeWrites)
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T15:00:49.843Z: JOB_MESSAGE_DEBUG: Executing success step success32
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T15:00:49.915Z: JOB_MESSAGE_DETAILED: Cleaning up.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T15:00:49.961Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T15:00:49.991Z: JOB_MESSAGE_BASIC: Stopping worker pool...
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T15:03:00.901Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T15:03:00.948Z: JOB_MESSAGE_BASIC: Worker pool stopped.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-06-13T15:03:00.977Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job 2023-06-13_07_56_36-3978289479758699563 is in state JOB_STATE_DONE
[32mINFO [0m apache_beam.io.gcp.tests.bigquery_matcher:bigquery_matcher.py:121 Attempting to perform query SELECT * FROM python_xlang_storage_write_1686668185_cbd25c.write_with_beam_rows to BQ
[32mINFO [0m apache_beam.io.gcp.tests.bigquery_matcher:bigquery_matcher.py:158 Result of query is: [(1, 0.1, Decimal('1.11'), 'a', True, b'a', datetime.datetime(1970, 1, 1, 0, 16, 40, 100, tzinfo=datetime.timezone.utc)), (2, 0.2, Decimal('2.22'), 'b', False, b'b', datetime.datetime(1970, 1, 1, 0, 33, 20, 200, tzinfo=datetime.timezone.utc)), (4, 0.4, Decimal('4.44'), 'd', False, b'd', datetime.datetime(1970, 1, 1, 1, 6, 40, 400, tzinfo=datetime.timezone.utc)), (3, 0.3, Decimal('3.33'), 'c', True, b'd', datetime.datetime(1970, 1, 1, 0, 50, 0, 300, tzinfo=datetime.timezone.utc))]
[32mINFO [0m apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:120 Deleting dataset python_xlang_storage_write_1686668185_cbd25c in project apache-beam-testing
PASSED
=============================== warnings summary ===============================
../../build/gradleenv/2050596099/lib/python3.11/site-packages/hdfs/config.py:15
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/hdfs/config.py>:15: DeprecationWarning: the imp module is deprecated in favour of importlib and slated for removal in Python 3.12; see the module's documentation for alternative uses
from imp import load_source
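As the DeprecationWarning above notes, `imp` is slated for removal in Python 3.12 in favour of `importlib`. A minimal importlib-based stand-in for the deprecated `imp.load_source` call flagged here might look like this (the helper name `load_source` is kept only for illustration; this is a sketch, not the hdfs library's actual fix):

```python
import importlib.util
import sys

def load_source(module_name, file_path):
    """Rough importlib-based replacement for the deprecated imp.load_source."""
    spec = importlib.util.spec_from_file_location(module_name, file_path)
    module = importlib.util.module_from_spec(spec)
    # Register before executing, matching imp.load_source's behaviour.
    sys.modules[module_name] = module
    spec.loader.exec_module(module)
    return module
```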
../../build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py:121
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py>:121: DeprecationWarning: pkg_resources is deprecated as an API
warnings.warn("pkg_resources is deprecated as an API", DeprecationWarning)
../../build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py:2870: 18 warnings
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py>:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
../../build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py:2870: 13 warnings
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py>:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.cloud')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
../../build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py:2349
../../build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py:2349
../../build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py:2349
../../build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py:2349
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py>:2349: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(parent)
../../build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py:2870
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py>:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.logging')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
../../build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py:2870
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py>:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.iam')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/rpc/__init__.py:20
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/google/rpc/__init__.py>:20: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.rpc')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
pkg_resources.declare_namespace(__name__)
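The PEP 420 alternative these warnings keep recommending needs no `declare_namespace` call at all: any directory on `sys.path` that lacks an `__init__.py` is treated as an implicit namespace package. A minimal sketch (directory and value names here are made up purely for illustration):

```python
import importlib
import os
import sys
import tempfile

# Build a throwaway package tree: demo_ns/ has NO __init__.py, so Python
# treats it as a PEP 420 implicit namespace package; its subpackage does
# have one, supplying a value we can import.
root = tempfile.mkdtemp()
subpkg = os.path.join(root, "demo_ns", "part_a")
os.makedirs(subpkg)
with open(os.path.join(subpkg, "__init__.py"), "w") as f:
    f.write("VALUE = 'a'\n")

sys.path.insert(0, root)
importlib.invalidate_caches()
part_a = importlib.import_module("demo_ns.part_a")
```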
../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py>:17: DeprecationWarning: The distutils package is deprecated and slated for removal in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
from distutils import util
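Per PEP 632, `distutils` is removed in Python 3.12. One common reason code imports `distutils.util` is `strtobool`; if that is what is needed (an assumption — the log does not show which `util` function this file uses), a self-contained replacement mirroring its historical truth table is easy to carry inline:

```python
def strtobool(val: str) -> int:
    """Pure-Python stand-in for distutils.util.strtobool (removed with
    distutils in Python 3.12; see PEP 632). Returns 1 for truthy strings,
    0 for falsy ones, and raises ValueError otherwise."""
    val = val.lower()
    if val in ("y", "yes", "t", "true", "on", "1"):
        return 1
    if val in ("n", "no", "f", "false", "off", "0"):
        return 0
    raise ValueError(f"invalid truth value {val!r}")
```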
apache_beam/typehints/pandas_type_compatibility_test.py:67
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:67: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
}).set_index(pd.Int64Index(range(123, 223), name='an_index')),
apache_beam/typehints/pandas_type_compatibility_test.py:90
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:90: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
pd.Int64Index(range(123, 223), name='an_index'),
apache_beam/typehints/pandas_type_compatibility_test.py:91
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:91: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
pd.Int64Index(range(475, 575), name='another_index'),
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_nested_records_and_lists
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_at_least_once
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_auto_sharding
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_with_at_least_once_semantics
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2028: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
is_streaming_pipeline = p.options.view_as(StandardOptions).streaming
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_nested_records_and_lists
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_at_least_once
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_auto_sharding
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_with_at_least_once_semantics
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2034: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
experiments = p.options.view_as(DebugOptions).experiments or []
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/pytest_gcpCrossLanguage.xml> -
=== 7 passed, 9 skipped, 6931 deselected, 56 warnings in 3542.40s (0:59:02) ====
> Task :sdks:python:test-suites:dataflow:py311:gcpCrossLanguageCleanup
Stopping expansion service pid: 2831868.
Skipping invalid pid: 2831869.
> Task :sdks:python:test-suites:xlang:fnApiJobServerCleanup
Killing process at 2830361
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build.gradle>' line: 96
* What went wrong:
Execution failed for task ':sdks:python:bdistPy37linux'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings
Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.
BUILD FAILED in 1h 4m 50s
115 actionable tasks: 79 executed, 32 from cache, 4 up-to-date
Publishing build scan...
https://ge.apache.org/s/cosfc5wtcsory
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Jenkins build is back to normal : beam_PostCommit_Python_Xlang_Gcp_Dataflow #311
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/311/display/redirect?page=changes>