Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/06/25 19:25:08 UTC

Build failed in Jenkins: beam_PostCommit_Python38 #1358

See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1358/display/redirect?page=changes>

Changes:

[noreply] [BEAM-9547] Add implementation for from_dict, from_records (#15034)

[noreply] [BEAM-11951] added "Differences from Pandas" page for DataFrame (#15074)

[noreply] [BEAM-9547] reindex is order-sensitive (#15032)


------------------------------------------
[...truncated 45.88 MB...]
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "_equal"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYJmaxgABPZKJBYnJGanxSamJuXrJ+SmpRcVQakqPsFticUlAUWZuZklmWWqxM0h4yuQpmo1Taqf08MenJebkJCUmZ8eD1U/JYOjhDshMzs5JRVGYVJykBwDOUCqY",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYJmaxgABPZKJBYnJGanxSamJuXrJ+SmpRcVQakqPsFticUlAUWZuZklmWWqxM0h4yuQpmo1Taqf08MenJebkJCUmZ8eD1U/JYOjhDshMzs5JRVGYVJykBwDOUCqY",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_7"
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYJmaxgABPZKJBYnJGanxSamJuXrJ+SmpRcVQakqPsFticUlAUWZuZklmWWqxM0h4yuQpmo1Taqf08MenJebkJCUmZ8eD1U/JYOjhDshMzs5JRVGYVJykBwDOUCqY",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_7"
                    }
                  ],
                  "is_pair_like": true,
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_7"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "Assert Checksums/Match.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s27"
        },
        "serialized_fn": "QlpoOTFBWSZTWXPtzvQAAzD//n///8vWd//3P//c57////Z/9Q5BcAAAAQABARBQBF7c94dO9d723mr17e26GpiEJppMSek2k9QeUyb0npR6nk0nppB6hpkAD1B6h4oHqB6g9qgyAGjTQamU00Yp6TSnmEaCjxCeoAHqekDIAAAAAaAADaTTygABomjIhU9lT9UyPRojTT1G1Ggeo0ZNAAAGg0AAAA0GIABtQaaVPUeqbUNNGQNNAABpoAAAAAAAAAAAAAAEqaJqp+mo02hpTyGKGhowNAAGoG0jJ6JgIwAACMBMJpo0aHUFqnLlG/7RWWYbs8o+AxMVjSE/zjvJneYiNG5+2j6sGjBEQi/IotXLRPPa6qLKQ/DefSCRAWnNLkx2zylMCGnPpJYA8wVPvzXTwA8nnnkk0RP+u6lJiYSJZPhdwLlRCKnH9x0YQMQhBFwDy2JjAWhsUgdnTKSMxyImgVJga2DICc5SRwMFikOleFA7AKNBRSASgaQyTV3NRCssGIjzUyekXI2zKJSNcbJ4Z4B/q0DMnjoEoTapjoEpTW2+XNF33RkaA/bGhKIhON/ehgSNphAQxi8Fh1gv38eDv6WDKvNZAUsOUm35a8CRFAMchxoZJebo0cCqedPDRP7VjWu7S35IOxo609vEou9GGqu47xZJSVU75AgaFccOUi2Vct0Z4xKNi/kgQhi94BqAFIMi38nY9lbHFtmKwqanSIOwVUeOTMEYrlo0G5AGPodTwSWNoIn9Ok15+eQghyHh27TG0GhA3I1NNAbU8U4KE1yL2U98XHzMsIBMUoQCbMFQDmeTDhtJ6RkuZL+zedlKW0iEDmqFjQr+bVtzt5EJgBWHQhj3KQ+qQcUvkiABHzY6Foh5DCUYaXa/gdq5pV46KeqyPGixVo6kBFB5t1HY4ikejko6StI022A8g4JWAPeYmJ7GCZStnMYHqVbY71iRPbMAkBkZDWcp7ormWBKYhNIaFB8J4maa6irsMlRJXxh4eK2G6BChlYDspDIo0gIyFOgPQHryKkhcKhVdczZxylAIcyAPFko/CxRDAVsyyYAyosHWJTMzEChymBLfhASarYoIBw35eaCG/uzgD1M3gASCYYZ23TB/EFFYnFeCyhOCnA+x8jUnUaGvqRAZGZB4EJAKJXIZFxGtEJ/Pyoeh8HGmABgOUoaga6aAY5SoWCwcWohAjFICKnuIxAXQSbcQBTUHysgGCBStRVQ1UirIgQ2zFQvuI34RPGD7pIA6gX7FCLpRAlVLro29eO1mDQFENv8QgHqZz3BaBtXo2VRAdpBP1AIRlShElyzDka8ICPvMYwRl3DFkkkLmLUwtBhOtpcVynlkkeCkX0nA1c6IboLFGSKQTAgDB1g6wsKjfN/BP/r+WWe85YKvGLjqYYoYPviWAMjN2v1LtgBEunCQoW3J104sU3FdZFIJREbFYUSLwz/KOECepsjD3KAcD1V6924jDI3r2fZpYpxNKIGh+pCsa7TcBWSrUVB2uJSgRQCyCC9DPECHXpJDO24mr5SVlpEqMNp9K9Kw8jwqGsVQLXAsHGISvMpV0JEbEGWIqIWUYAiQXcaz9NMXxBOGUJBKI4QilseqEo6jFrywhDEYWSOGqoVh0wBMJShtP7mxnAhthFAglezTje0pAscKRuT3jRyEiBDcLHSp9FAqTfIUYwyTfWnBQXg0znQpvtQxS3+bSahZulY3Rm56eTBFiZMSWHJKqA7qqtaXWGqk1ZX1LW5bE04CLC1fz2keME1aM5uPbeavE2SbfGmkXN/4u5IpwoSDn253o",
        "user_name": "Assert Checksums/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: '2021-06-25T18:47:46.825282Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2021-06-25_11_47_45-903897993043844875'
 location: 'us-central1'
 name: 'beamapp-jenkins-0625184738-244409'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2021-06-25T18:47:46.825282Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2021-06-25_11_47_45-903897993043844875]
apache_beam.runners.dataflow.internal.apiclient: INFO: Submitted job: 2021-06-25_11_47_45-903897993043844875
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_47_45-903897993043844875?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2021-06-25_11_47_45-903897993043844875 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:49.132Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2021-06-25_11_47_45-903897993043844875. The number of workers will be between 1 and 1000.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:49.234Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2021-06-25_11_47_45-903897993043844875.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:51.074Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-b.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:51.776Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:51.810Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Assert Checksums/Group/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:51.842Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Matched Files/Group/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:51.868Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:51.896Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:51.989Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.019Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.043Z: JOB_MESSAGE_DETAILED: Unzipping flatten s9 for input s7.None
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.069Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of Matched Files/Group/GroupByKey/Reify, through flatten Matched Files/Group/Flatten, into producer Matched Files/Group/pair_with_0
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.092Z: JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/Group/GroupByKey/Reify into Matched Files/Group/pair_with_1
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.115Z: JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/Group/GroupByKey/GroupByWindow into Matched Files/Group/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.167Z: JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/Group/Map(_merge_tagged_vals_under_key) into Matched Files/Group/GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.196Z: JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/Unkey into Matched Files/Group/Map(_merge_tagged_vals_under_key)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.224Z: JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/Match into Matched Files/Unkey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.263Z: JOB_MESSAGE_DETAILED: Unzipping flatten s9-u22 for input s10-reify-value0-c20
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.297Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of Matched Files/Group/GroupByKey/Write, through flatten Matched Files/Group/Flatten/Unzipped-1, into producer Matched Files/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.330Z: JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/Group/GroupByKey/Write into Matched Files/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.364Z: JOB_MESSAGE_DETAILED: Fusing consumer MatchAll/ParDo(_MatchAllFn) into Create/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.398Z: JOB_MESSAGE_DETAILED: Fusing consumer GetPath into MatchAll/ParDo(_MatchAllFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.431Z: JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/WindowInto(WindowIntoFn) into GetPath
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.465Z: JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/ToVoidKey into Matched Files/WindowInto(WindowIntoFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.499Z: JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/Group/pair_with_1 into Matched Files/ToVoidKey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.532Z: JOB_MESSAGE_DETAILED: Unzipping flatten s24 for input s22.None
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.568Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of Assert Checksums/Group/GroupByKey/Reify, through flatten Assert Checksums/Group/Flatten, into producer Assert Checksums/Group/pair_with_0
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.602Z: JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/Group/GroupByKey/Reify into Assert Checksums/Group/pair_with_1
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.624Z: JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/Group/GroupByKey/GroupByWindow into Assert Checksums/Group/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.660Z: JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/Group/Map(_merge_tagged_vals_under_key) into Assert Checksums/Group/GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.722Z: JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/Unkey into Assert Checksums/Group/Map(_merge_tagged_vals_under_key)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.793Z: JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/Match into Assert Checksums/Unkey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.822Z: JOB_MESSAGE_DETAILED: Unzipping flatten s24-u29 for input s25-reify-value9-c27
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.849Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of Assert Checksums/Group/GroupByKey/Write, through flatten Assert Checksums/Group/Flatten/Unzipped-1, into producer Assert Checksums/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.883Z: JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/Group/GroupByKey/Write into Assert Checksums/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.904Z: JOB_MESSAGE_DETAILED: Fusing consumer MatchOneAll/ParDo(_MatchAllFn) into SingleFile/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.931Z: JOB_MESSAGE_DETAILED: Fusing consumer ReadMatches/ParDo(_ReadMatchesFn) into MatchOneAll/ParDo(_MatchAllFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.958Z: JOB_MESSAGE_DETAILED: Fusing consumer ReadIn into ReadMatches/ParDo(_ReadMatchesFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:52.992Z: JOB_MESSAGE_DETAILED: Fusing consumer Checksums into ReadIn
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:53.029Z: JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/WindowInto(WindowIntoFn) into Checksums
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:53.063Z: JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/ToVoidKey into Assert Checksums/WindowInto(WindowIntoFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:53.097Z: JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/Group/pair_with_1 into Assert Checksums/ToVoidKey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:53.153Z: JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/Group/pair_with_0 into Matched Files/Create/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:53.179Z: JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/Group/pair_with_0 into Assert Checksums/Create/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:53.214Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:53.254Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:53.278Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:53.311Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:53.529Z: JOB_MESSAGE_DEBUG: Executing wait step start40
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:53.595Z: JOB_MESSAGE_BASIC: Executing operation Matched Files/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:53.615Z: JOB_MESSAGE_BASIC: Executing operation Assert Checksums/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:53.640Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:53.665Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:53.938Z: JOB_MESSAGE_BASIC: Finished operation Assert Checksums/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:53.953Z: JOB_MESSAGE_BASIC: Finished operation Matched Files/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:54.010Z: JOB_MESSAGE_DEBUG: Value "Assert Checksums/Group/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:54.044Z: JOB_MESSAGE_DEBUG: Value "Matched Files/Group/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:54.065Z: JOB_MESSAGE_BASIC: Executing operation SingleFile/Read+MatchOneAll/ParDo(_MatchAllFn)+ReadMatches/ParDo(_ReadMatchesFn)+ReadIn+Checksums+Assert Checksums/WindowInto(WindowIntoFn)+Assert Checksums/ToVoidKey+Assert Checksums/Group/pair_with_1+Assert Checksums/Group/GroupByKey/Reify+Assert Checksums/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:54.086Z: JOB_MESSAGE_BASIC: Executing operation Assert Checksums/Create/Read+Assert Checksums/Group/pair_with_0+Assert Checksums/Group/GroupByKey/Reify+Assert Checksums/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:54.109Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+MatchAll/ParDo(_MatchAllFn)+GetPath+Matched Files/WindowInto(WindowIntoFn)+Matched Files/ToVoidKey+Matched Files/Group/pair_with_1+Matched Files/Group/GroupByKey/Reify+Matched Files/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:47:54.167Z: JOB_MESSAGE_BASIC: Executing operation Matched Files/Create/Read+Matched Files/Group/pair_with_0+Matched Files/Group/GroupByKey/Reify+Matched Files/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:48:16.650Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:48:42.215Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-b failed to bring up any of the desired 1 workers. ZONE_RESOURCE_POOL_EXHAUSTED: The zone 'projects/apache-beam-testing/zones/us-central1-b' does not have enough resources available to fulfill the request.  Try a different zone, or try again later.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:48:42.248Z: JOB_MESSAGE_ERROR: Workflow failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:48:42.309Z: JOB_MESSAGE_BASIC: Finished operation Matched Files/Create/Read+Matched Files/Group/pair_with_0+Matched Files/Group/GroupByKey/Reify+Matched Files/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:48:42.309Z: JOB_MESSAGE_BASIC: Finished operation Assert Checksums/Create/Read+Assert Checksums/Group/pair_with_0+Assert Checksums/Group/GroupByKey/Reify+Assert Checksums/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:48:42.309Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+MatchAll/ParDo(_MatchAllFn)+GetPath+Matched Files/WindowInto(WindowIntoFn)+Matched Files/ToVoidKey+Matched Files/Group/pair_with_1+Matched Files/Group/GroupByKey/Reify+Matched Files/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:48:42.310Z: JOB_MESSAGE_BASIC: Finished operation SingleFile/Read+MatchOneAll/ParDo(_MatchAllFn)+ReadMatches/ParDo(_ReadMatchesFn)+ReadIn+Checksums+Assert Checksums/WindowInto(WindowIntoFn)+Assert Checksums/ToVoidKey+Assert Checksums/Group/pair_with_1+Assert Checksums/Group/GroupByKey/Reify+Assert Checksums/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:48:42.386Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:48:42.440Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:48:42.470Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:49:00.301Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-25T18:49:00.332Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2021-06-25_11_47_45-903897993043844875 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
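The ZONE_RESOURCE_POOL_EXHAUSTED error in the captured log above advises trying a different zone or retrying later. A minimal sketch of that fallback loop is below; note that `launch_job`, `ZoneExhausted`, and the zone list are hypothetical stand-ins for illustration, not Beam or Dataflow APIs.

```python
class ZoneExhausted(Exception):
    """Raised when a zone has no capacity for the requested workers."""

# Illustrative zone list; us-central1-b is the zone that was exhausted above.
FALLBACK_ZONES = ['us-central1-b', 'us-central1-f', 'us-central1-c']

def launch_with_fallback(launch_job, zones=FALLBACK_ZONES):
    """Try each zone in order; return the first successful launch result."""
    last_err = None
    for zone in zones:
        try:
            return launch_job(zone)
        except ZoneExhausted as err:
            last_err = err  # no capacity in this zone; try the next one
    raise last_err
```

In practice the same effect can be had declaratively, e.g. by pinning the pipeline to a different zone (or only a region, letting the service pick a zone with capacity) at submission time.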
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_05_52-18226877826297016961?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_20_25-15838651103486671101?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_30_19-17827334232817228002?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_40_14-10003829187723124260?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_49_26-13670809108103562383?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_59_02-5195369128676074703?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_12_09_23-4714144350021199790?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_05_49-8530830228396904476?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_07_37-13410422423879732140?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_17_14-13172787436277174181?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_27_56-3984525175953514029?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_36_19-13972923185297148406?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_45_17-3117090674279066696?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_12_02_21-15619549274894526316?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_05_51-3886505550945307566?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_17_46-6868409229442495975?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_27_01-14573599409812393391?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_28_51-8242596268522185891?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_38_06-739236122465285670?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_47_16-10328278082315704359?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_55_46-8933368316105334856?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_12_03_49-1956592646179283096?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_12_12_42-15612162020043473665?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_05_44-11727157232814909638?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_26_01-12448341635749353521?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_35_21-13642065566894992113?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_44_15-10063212224438364065?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_45_50-15801414817585230110?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_55_30-13946431077937170950?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_12_05_24-16789194774772529261?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_12_16_02-9162122687204631772?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_07_58-13280672903500394014?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_18_45-9599355879664676921?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_26_44-16907709443784567607?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_34_52-6750335157970152638?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_44_47-13909878701581311364?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_53_45-16066363980485835795?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_12_01_48-15119867450007438380?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_05_44-17302103234265718973?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_16_04-13155184605278335992?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_17_49-14734494661597779015?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_19_39-16838773419817287131?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_28_34-2618367994292999070?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_38_27-1876984475057640866?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_47_30-18426505956985892596?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_55_58-17046828641720307549?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_12_03_47-3821735104069258239?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_05_50-16587322185640700781?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_16_43-2827516745032222373?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_26_04-13879138396949212511?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_37_35-5849292321630018847?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_47_45-903897993043844875?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_49_25-16709414129996467647?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_59_56-18291686503831951437?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_05_44-4818489321924140732?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_14_53-6766077203812331024?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_31_58-7287426924173949929?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_11_57_28-9498650561115282683?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_12_06_12-1106603568271098834?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 71 tests in 4789.341s

FAILED (SKIP=8, errors=6)

> Task :sdks:python:test-suites:dataflow:py38:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 126

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 24m 42s
214 actionable tasks: 152 executed, 58 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/e6ydjuo4nqfpa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python38 #1360

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1360/display/redirect>




Build failed in Jenkins: beam_PostCommit_Python38 #1359

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1359/display/redirect?page=changes>

Changes:

[noreply] Minor: Fix incorrect s.apache.org links (#15084)

[noreply] Minor: Announce DataFrame API leaving experimental in 2.32.0 (#15053)

[noreply] [BEAM-9547] Add support for xs on DataFrame and Series (#15078)


------------------------------------------
[...truncated 45.66 MB...]
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "ReadFromPubSub/Read.out"
          }
        ],
        "pubsub_id_label": "id",
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/psit_subscription_input3b85bba5-1994-47ed-b8c8-2bd1a17c9608",
        "pubsub_timestamp_label": "timestamp",
        "user_name": "ReadFromPubSub/Read"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s2",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "modify_data"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "modify_data.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "serialized_fn": "ref_AppliedPTransform_modify_data_4",
        "user_name": "modify_data"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s3",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "bytes_to_proto_str"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "WriteToPubSub/ToProtobuf.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s2"
        },
        "serialized_fn": "ref_AppliedPTransform_WriteToPubSub-ToProtobuf_6",
        "user_name": "WriteToPubSub/ToProtobuf"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s4",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s3"
        },
        "pubsub_id_label": "id",
        "pubsub_serialized_attributes_fn": "",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_output3b85bba5-1994-47ed-b8c8-2bd1a17c9608",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
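The step names and "kind:bytes" coders in the job graph above imply a four-step pipeline: ReadFromPubSub/Read -> modify_data -> WriteToPubSub/ToProtobuf -> WriteToPubSub/Write/NativeWrite. The bodies of modify_data and bytes_to_proto_str are not in the log, so the stand-ins below are hypothetical; this sketch only illustrates the bytes-in/bytes-out contract and the stage fusion Dataflow reports later in this log ("Fusing consumer modify_data into ReadFromPubSub/Read", etc.):

```python
# Hypothetical sketch of the fused stage order from the job graph above.
# modify_data and bytes_to_proto_str are placeholders; the real transform
# bodies are not shown anywhere in this log.

def modify_data(msg: bytes) -> bytes:
    # Stand-in for the test's ParDo step "modify_data" (bytes in, bytes out,
    # per the kind:bytes coders in the graph).
    return msg + b"-modified"

def bytes_to_proto_str(msg: bytes) -> bytes:
    # Stand-in for "WriteToPubSub/ToProtobuf"; the real step serializes each
    # element into a Pub/Sub message proto before publishing.
    return msg

def run_fused_stage(inputs):
    # Mirrors the fusion: each element flows through both ParDos in a single
    # fused stage before reaching the Pub/Sub write.
    return [bytes_to_proto_str(modify_data(m)) for m in inputs]
```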
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: '2021-06-26T01:03:06.472759Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2021-06-25_18_03_05-15633346651121477160'
 location: 'us-central1'
 name: 'beamapp-jenkins-0626010256-527149'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2021-06-26T01:03:06.472759Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2021-06-25_18_03_05-15633346651121477160]
apache_beam.runners.dataflow.internal.apiclient: INFO: Submitted job: 2021-06-25_18_03_05-15633346651121477160
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-25_18_03_05-15633346651121477160?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2021-06-25_18_03_05-15633346651121477160 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:03:11.639Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:03:13.603Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:03:13.655Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:03:13.712Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:03:13.755Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:03:13.790Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:03:13.822Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:03:13.868Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:03:13.899Z: JOB_MESSAGE_DETAILED: Fusing consumer modify_data into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:03:13.920Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/ToProtobuf into modify_data
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:03:13.952Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write/NativeWrite into WriteToPubSub/ToProtobuf
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:03:14.047Z: JOB_MESSAGE_BASIC: The pubsub read for: projects/apache-beam-testing/subscriptions/psit_subscription_input3b85bba5-1994-47ed-b8c8-2bd1a17c9608 is configured to compute input data watermarks based on custom timestamp attribute timestamp. Cloud Dataflow has created an additional tracking subscription to do this, which will be cleaned up automatically. For details, see: https://cloud.google.com/dataflow/model/pubsub-io#timestamps-ids
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:03:14.081Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:03:14.340Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:03:14.363Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:03:14.455Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:03:14.636Z: JOB_MESSAGE_DEBUG: Executing wait step start19
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:03:15.874Z: JOB_MESSAGE_DETAILED: Pub/Sub resources set up for topic 'projects/apache-beam-testing/topics/psit_topic_input3b85bba5-1994-47ed-b8c8-2bd1a17c9608'.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:03:15.950Z: JOB_MESSAGE_BASIC: Executing operation ReadFromPubSub/Read+modify_data+WriteToPubSub/ToProtobuf+WriteToPubSub/Write/NativeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:03:15.998Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:03:16.021Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:03:43.533Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:04:05.358Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:04:45.359Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-06-26T01:04:45.397Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2021-06-25_18_03_05-15633346651121477160 after 184 seconds
google.auth._default: DEBUG: Checking None for explicit credentials as part of auth process...
google.auth._default: DEBUG: Checking Cloud SDK credentials as part of auth process...
google.auth._default: DEBUG: Cloud SDK credentials not found on disk; not using them
google.auth._default: DEBUG: Checking for App Engine runtime as part of auth process...
google.auth._default: DEBUG: No App Engine library was found so cannot authentication via App Engine Identity Credentials.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fpubsub
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fpubsub HTTP/1.1" 200 244
apache_beam.io.gcp.tests.pubsub_matcher: ERROR: Timeout after 300 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/psit_subscription_output3b85bba5-1994-47ed-b8c8-2bd1a17c9608.
oauth2client.transport: INFO: Refreshing due to a 401 (attempt 1/2)
--------------------- >> end captured logging << ---------------------
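The ERROR above is the integration test's pubsub_matcher giving up after its 300-second deadline with zero messages received on the output subscription. That kind of matcher is essentially a bounded poll loop; a minimal sketch (not Beam's actual implementation; pull_fn is a hypothetical callable standing in for a Pub/Sub pull) looks like:

```python
import time

def wait_for_messages(pull_fn, expected_count, timeout_sec=300, poll_interval=1.0):
    # Poll pull_fn until expected_count messages arrive or timeout_sec
    # elapses; return whatever was received either way.
    deadline = time.monotonic() + timeout_sec
    received = []
    while len(received) < expected_count and time.monotonic() < deadline:
        received.extend(pull_fn())
        if len(received) < expected_count:
            time.sleep(poll_interval)
    return received
```

A loop like this returning an empty list after the deadline is what surfaces as the "Timeout after 300 sec. Received 0 messages" failure in the log.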

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 71 tests in 5600.979s

FAILED (SKIP=8, failures=1)

> Task :sdks:python:test-suites:dataflow:py38:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 126

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 37m 34s
214 actionable tasks: 152 executed, 58 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/b3x2lysgsogu4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------