Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/02/02 01:33:00 UTC

Build failed in Jenkins: beam_PostCommit_Python37 #3401

See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/3401/display/redirect?page=changes>

Changes:

[zyichi] Setup InfluxDbIO_IT jenkins job cron

[Kyle Weaver] [BEAM-10379] Remove BIT_XOR from ZetaSQL supported functions list.

[Kyle Weaver] [BEAM-11732] Revert flink-clients from runtime to compile configuration.

[noreply] [BEAM-11731] Restrict to numpy <1.20.0 (#13870)

[noreply] [BEAM-11357] Copy Annotations when cloning PTransforms (#13865)

[noreply] [BEAM-11693]  Update formatting. Fix email template (#13815)


------------------------------------------
[...truncated 50.89 MB...]
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s8",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "RemoveExtractedFiles",
            "type": "STRING",
            "value": "apache_beam.io.gcp.bigquery_read_internal.RemoveExtractedFiles"
          }
        ],
        "non_parallel_inputs": {
          "python_side_input0-read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)": {
            "@type": "OutputReference",
            "output_name": "out",
            "step_name": "SideInput-s6"
          },
          "python_side_input1-read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)": {
            "@type": "OutputReference",
            "output_name": "out",
            "step_name": "SideInput-s7"
          }
        },
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }
                  ],
                  "is_pair_like": true,
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles).out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s5"
        },
        "serialized_fn": "QlpoOTFBWSZTWblQkJkABoV//////////////////////////////////////kAQCAAg4Abw6Z9aDNaNWbIg6yE52GijRNCaNDSaaZpNPRMmEwT1MT1A2pppgg0ZAaMhkAANAaBoBk0aGjQ0aBkNAaAaaBoaMhkZknqAGggSegTTTQmQRMmgyNNAZAANBpkAAaBoAAGho0NB6QAAAAAAAAAAAAAEAADQABkABoABoyaAADIDQAAAA0ADTTTQGmgDQyGQyaAADQNABoNGJkCKenqlTynptUeoGIYmQwCMTJowAg0ZNNDE0yDBDTRiDEwExBoaGI0xMCDQxGQZGmCZDCAaMJtJhAAA0AAZAAaAAaMmgAAyA0AAAANAA0000BpoA0MhkMmgAA0DQAaDRiZASRTRoQTCp7VPU/Sj0yanpPEjyjE8ifqQZAAAAAAAAABoAAAAAAAAyAAHqAAAANVqLQpnGrvUtoSqe5gkKqNlmdF/NZQoviamMcVLMDqHyY2M5zXnhTThjbtuRk80JBWJsfEREAKLOdUi0xGMcZlFMiIZXSoREOFG9G1RqVCEC1sBmIO6q14tumm5Ze7D/aiu84gy5wtfjKBdquLf5HBYJJXWJArupS1N1czBC3OXPwy5Q20yDwVChxOtNCEwQSIwgkkwQBJaBFMCQZT1FOUXT6xmYCCQrNUkQMkDqA9AcdUxrCEkmrkY7GlBYhMhUIoUkMBUQKERmBVIzBqltUaEQz2NCzCGgUGQOrSHpWT12BusZGkWYXQQsvxgXhIzXg05oJE3KpZDYOLuQp8RGYTWQcbIlwba38PuaQNm2GXnlN81E5EHrtoN1zG/P6XVsu0qqqqquUUQi6+t7fvwjjGs1rQZBGRIsiEEGyGWWOKGZTKoCZoMOf1E2Uo74zSklqZJTSSjud0bUIDhiZ14NiDKYPaAwI1UMQsEB6MRJlSGoJmvziWclppmIy1nZizlsUA6awYkg0yNVxi3PGQNscSPeiEdWj0aaTmdNB46RNoXJpLOQxok9ImGddYJr96FUZl49hbb6vyoGjynHe3wzf6gdANWyQMxRBcFBABdg6XZ4zcztHwOFxKuwjH2msep4/N8mkiZnjYJrJwsy+KKBwFBDV6qqpYYwkvYpknNheDcyyHqiiMfkkpWwFruJjX1lXBzGVaMKh6quHHs6m/nU95WL89XZvk4eTc8Xi0LJSptEmTuxVrWKVB2IPQikqlo9a5sq2UUKfBypnmGYaWKiBSjICNQwLHo0zhCaAAWkwrg14C6HoiI5dgJ0LG9M4xUdwmUHHkoitmjRGEsgbWrBEohF6Sgd9TxlqXTPNokRg1onYSNblVKWpasQDrZ5FeV4ZTCFTArhY8QTRjApAYMwRmMIZWT9QLYMavS1xAnkOKV3o3CcXNXdUmAFmcSLZbNwImd+EMQn5JGrzpyhlS6tlb1vZ1nsaySoLSwhe87tneB62cCUpT1D52LNJUm3mjAwilCIggNVQDbAgk0SElplAJnaKhNTp6NxbmOCkY8qJ5tI6YZMmGa9PVVlgsSWJHhlshEAL5LIHsgqjT1IlvHyGbRlUk6Kh5axlF4iQZnU+YqZXa8JGbFpiUJyQpO/GUbghUEgftOq3YEu4qZxjtcAtKbdfNHskKgo0B8F8EB0pDGlDWCqKPWz7CaUTMhxXGEPBEpcpegoU8BUEo11c/iYNjEWFAL+C+CjW0TyiCFkDROoJ6B5Vg3sQCOb9fy/NRrCuQUDFubXx/VZwb+Iq8O3cFOT0WzZgiGEz4hbRWHhB3EoYMASwVuQ/iaF1fTcAJsGJwXjuaDyCWxNA3bylGWKIQcqUWJIBD8UNCUIDAETmkYMIYnqZhohgZmKXsKrKGNi6HfwqS5rGMbG0hsEpFZW+ONgHZBWA7bEhTLflA3VM7oQDGEYBtNJEW5AsrWVZNAdti4+QMUWcIHsySvPJ5LknFrCJTIqJTm8MoJCLSeHzI1+0AHYBQGGIp3Uo9M1+5JAVW6PEnWoioil9+4KFTPUadidkSld1FWWq4U1NUE8XdkMD6XUQBmZieqTMpAqii7EFPIDDjhAkJMTbCVNXv0YTRw3TIZIACCzMsocA5gYIlcUBDD3KTK78CtJQLgKD8hMoUIJISFSQDTJCYKwaVGMQsBEKNEQmMhjoUj3AS2BXEVJjXRmjIw4cOMPBbtSxdtt3Wte58EayX5MCSNSvdEtRbg8JjISrO1KJEYO1tSGVyhDc1EefVSOc5qQcHLqWLfRq2t/Nh4rKbDu5bGU3KrbIIw4vdgOEEEtgTXyNuVY/fmqadCFxhat0eoTAAAAAAAAWABaAFFizSJrQwK0zMzMzOaxFO7I9jDMzM3RVZ1VESeZVbPLMHmZFRYLFIggxcPDrDEW6M11YoUHOU6Vo6BgJUaDKMYx7dTJKaGoSy2ESRZ64yJ0TW7L72XJjBIo0JM2QMoGjJsAuvRUzObLcP78tB9Iz4flCE4K8NCM8mgGuClbgqjXaFgYLyLePE+2MIgBBoVQWfmq+XWJs2QnGmov0UIAYNEptDImbS8UQBG0wHEqK8g8hUgvWwUhNF8TiaWJmM2LVc1DY0S+3oNO1JhItUrXB2S35iqIqlmtTEJ80Yz1leSBGKDWw+38/WtlaA+cmV0hi3xYV7XsuOEWS/MDaXLgFxRzpC+WSWZFgM1gis9drlWCw2oZ2TMqxLQykjfY5CGUqLRMydzSMDGHwTpjFgwo3faZOOZc6xuDRIA1ZA0R3xgrmJQ9y9x87q7xv0+15eufv+Gd9funeB+XjaEH+kUhkyAKGS5baJBg6RRQkbt02nzeOXU51mEiMDa5o16p1y1/Jh1CUjdSjMNcWAhBNiEkEZmkZGZGkZGZGkZGZN/jBNgjs7QkAZgZkCP4tf7Pn/+/8XckU4UJC5UJCZA",
        "user_name": "read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/ParDo(RemoveExtractedFiles)"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s9",
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED",
        "dataset": "python_query_to_table_16122269378458",
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "RowAsDictJsonCoder$eNprYE5OLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBK5xfp",
              "component_encodings": [],
              "pipeline_proto_coder_id": "ref_Coder_RowAsDictJsonCoder_6"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "bigquery",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s4"
        },
        "schema": "{\"fields\": [{\"name\": \"bytes\", \"type\": \"BYTES\", \"mode\": \"NULLABLE\"}, {\"name\": \"date\", \"type\": \"DATE\", \"mode\": \"NULLABLE\"}, {\"name\": \"time\", \"type\": \"TIME\", \"mode\": \"NULLABLE\"}]}",
        "table": "output_table",
        "user_name": "write/Write/NativeWrite",
        "write_disposition": "WRITE_EMPTY"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
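
(For context, the job graph above comes from a BigQuery query-to-table pipeline. The sketch below shows roughly how such a pipeline is written with the Beam Python SDK; the output table and schema are copied from the graph, while the query text, the options, and the use of WriteToBigQuery in place of the native sink shown in the graph are illustrative assumptions, not the actual test code.)

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Illustrative options only; the real test supplies its own project, runner,
    # temp_location, etc. Query-based reads also need a GCS location for the
    # temporary export files.
    options = PipelineOptions(flags=[])

    with beam.Pipeline(options=options) as p:
        (p
         | 'read' >> beam.io.ReadFromBigQuery(
             query='SELECT bytes, date, time FROM `some_dataset.some_table`',  # hypothetical query
             use_standard_sql=True)
         | 'write' >> beam.io.WriteToBigQuery(
             'python_query_to_table_16122269378458.output_table',
             schema='bytes:BYTES,date:DATE,time:TIME',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY))
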
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: '2021-02-02T00:49:16.137221Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2021-02-01_16_49_14-9714962978420022859'
 location: 'us-central1'
 name: 'beamapp-jenkins-0202004903-410513'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2021-02-02T00:49:16.137221Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2021-02-01_16_49_14-9714962978420022859]
apache_beam.runners.dataflow.internal.apiclient: INFO: Submitted job: 2021-02-01_16_49_14-9714962978420022859
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_49_14-9714962978420022859?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2021-02-01_16_49_14-9714962978420022859 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:17.149Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2021-02-01_16_49_14-9714962978420022859. The number of workers will be between 1 and 1000.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:17.383Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2021-02-01_16_49_14-9714962978420022859.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:19.666Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:21.363Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:21.413Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:21.443Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:21.474Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:21.527Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:21.565Z: JOB_MESSAGE_DETAILED: Fusing consumer read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough) into read/ReadFromBigQuery/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:21.600Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/NativeWrite into read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:21.651Z: JOB_MESSAGE_DETAILED: Fusing consumer read/ReadFromBigQuery/MapFilesToRemove into read/ReadFromBigQuery/FilesToRemoveImpulse/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:21.674Z: JOB_MESSAGE_DETAILED: Fusing consumer read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/ParDo(RemoveExtractedFiles) into read/ReadFromBigQuery/_PassThroughThenCleanup/Create/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:21.699Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:21.757Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:21.793Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:21.824Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:22.074Z: JOB_MESSAGE_DEBUG: Executing wait step start5
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:22.142Z: JOB_MESSAGE_BASIC: Executing operation read/ReadFromBigQuery/FilesToRemoveImpulse/Read+read/ReadFromBigQuery/MapFilesToRemove
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:22.167Z: JOB_MESSAGE_BASIC: Executing operation read/ReadFromBigQuery/Read+read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+write/Write/NativeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:22.189Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:22.242Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:31.967Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
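(The message above is a quota note from Dataflow, not an error. If cleaning up old custom metric descriptors is ever needed, a minimal sketch using the google-cloud-monitoring client library is below; the project id and filter string are assumptions, and deletion is destructive, so the delete call is left commented out.)

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'  # assumed project

    # List only custom metric descriptors, i.e. the ones counted against the limit.
    request = {
        'name': project_name,
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        print(descriptor.type)
        # Uncomment to actually delete an unused descriptor:
        # client.delete_metric_descriptor(request={'name': descriptor.name})
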
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:49:58.556Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:50:28.529Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-02T00:50:28.574Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
oauth2client.transport: INFO: Refreshing due to a 401 (attempt 1/2)
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2021-02-01_16_49_14-9714962978420022859 after 901 seconds
--------------------- >> end captured logging << ---------------------
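(On the timeout WARNING above: it is the test harness that gives up waiting, not the job itself, which may keep running in the service. A minimal sketch of the waiting pattern that produces this message, with an assumed bound of 15 minutes, is:)

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    p = beam.Pipeline(options=PipelineOptions(flags=[]))
    p | beam.Create([1, 2, 3])  # placeholder transform

    result = p.run()
    # duration is given in milliseconds; on the Dataflow runner, if the job is
    # still running when it elapses, the runner logs "Timing out on waiting for
    # job ..." and returns while the job continues in the service.
    result.wait_until_finish(duration=15 * 60 * 1000)
    print(result.state)
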
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_05_59-14504190380788989963?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_20_09-17189665073122789692?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_27_56-17240849923373307516?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_35_05-13810374468025460197?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_42_37-3517011084477398838?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_50_19-16657104625730858380?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_17_16_51-13529049232442472614?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_17_24_08-12828994699296109745?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_05_51-17481085145894087626?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_17_48-5477317531507128994?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_26_34-1084251453786456347?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_35_00-5717798533444923126?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_43_19-18238028526009046241?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_51_34-11742979885795319261?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_17_16_37-13023371632620765584?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_05_54-11847709388384921677?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_28_28-12234177234908271774?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_37_07-16417990800306497873?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_45_07-3515735303970002119?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_53_00-8842682763026630249?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_17_17_01-9228318781327401740?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_05_46-9571474969411999123?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_24_04-4357803563872978287?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_32_33-16412095390173241560?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_40_53-14367777739734273535?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_49_14-9714962978420022859?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_17_05_49-8188367098088989909?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_17_14_18-7599937883095391120?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_17_22_02-811830489780919668?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_05_56-7397751901574923846?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_15_29-13573245405728031911?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_24_44-5110273353827244143?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_34_47-4728283306898456764?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_43_14-11523436704767514026?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_51_07-4294911372391366240?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_17_00_33-8105707540109103906?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_17_10_11-11549195311931631802?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_17_17_36-5863871341686930043?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_05_49-2619043031936227947?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_14_36-3856908504160998551?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_23_06-755188688300105249?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_30_26-17370729724701477405?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_37_48-9141869753935729942?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_45_06-9634201118502309275?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_17_01_31-15621441034543981492?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_17_18_03-12243787818119862233?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_17_25_23-4161685878566215180?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_08_34-18383929463979520656?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_18_40-10928742150958451539?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_26_11-17463383179431482819?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_35_08-15719591870765337110?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_43_30-11308643817105093240?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_51_12-3019110575709382318?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_17_17_03-17406525238655176337?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_05_50-4769869976862347177?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_14_14-2968557703067460047?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_23_45-10802572426385516702?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_34_33-520290169982154111?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_42_15-5828156344147370943?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_16_51_39-796951921061763699?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_17_17_30-1406196168803352853?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 68 tests in 5268.148s

FAILED (SKIP=6, failures=3)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
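
(To reproduce locally, the failing task named above can usually be invoked from the root of a Beam checkout, e.g. ./gradlew :sdks:python:test-suites:dataflow:py37:postCommitIT --stacktrace; this assumes GCP credentials and a test project comparable to apache-beam-testing are configured.)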

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 31s
216 actionable tasks: 158 executed, 54 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/rmr6syakg4eoa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python37 #3403

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/3403/display/redirect>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #3402

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/3402/display/redirect>

Changes:


------------------------------------------
[...truncated 54.74 MB...]
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2021-02-01_23_15_47-1508648853722721173'
 location: 'us-central1'
 name: 'beamapp-jenkins-0202071539-321808'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2021-02-02T07:15:48.902291Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2021-02-01_23_15_47-1508648853722721173]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2021-02-01_23_15_47-1508648853722721173
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-01_23_15_47-1508648853722721173?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-01_23_15_47-1508648853722721173 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:49.907Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2021-02-01_23_15_47-1508648853722721173. The number of workers will be between 1 and 1000.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:50.013Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2021-02-01_23_15_47-1508648853722721173.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:53.056Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:54.864Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:54.900Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not followed by a combiner.
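(The DEBUG line above refers to the combiner-lifting optimization: partial combining before the shuffle only applies when a GroupByKey is immediately followed by a combiner. A minimal illustration of the two shapes, independent of this particular job, is:)

    import apache_beam as beam

    with beam.Pipeline() as p:
        kvs = p | beam.Create([('a', 1), ('a', 2), ('b', 3)])

        # GroupByKey followed by a plain Map: no combiner lifting, every value
        # for a key is shuffled before being reduced.
        summed_after_shuffle = (kvs
            | 'GBK' >> beam.GroupByKey()
            | 'SumValues' >> beam.MapTuple(lambda k, vs: (k, sum(vs))))

        # CombinePerKey: the runner can lift the combiner and pre-sum values
        # on each worker before the shuffle.
        summed_with_lifting = kvs | 'SumPerKey' >> beam.CombinePerKey(sum)
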
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:54.944Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:55.034Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:55.112Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:55.148Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:55.180Z: JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:55.211Z: JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:55.246Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:55.282Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:55.319Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:55.351Z: JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:55.391Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:55.422Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:55.453Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:55.487Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:55.730Z: JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:55.802Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:55.849Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:55.878Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:55.961Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:56.030Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:15:56.127Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:16:07.075Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:16:14.858Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Create/Read+Type matches/Group/pair_with_0+Type matches/Group/GroupByKey/Reify+Type matches/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:16:19.448Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+InspectForDetails/ParDo(_InspectFn)+ParDo(CallableWrapperDoFn)/ParDo(CallableWrapperDoFn)+Type matches/WindowInto(WindowIntoFn)+Type matches/ToVoidKey+Type matches/Group/pair_with_1+Type matches/Group/GroupByKey/Reify+Type matches/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:16:19.517Z: JOB_MESSAGE_BASIC: Executing operation Type matches/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:16:19.566Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:16:19.629Z: JOB_MESSAGE_BASIC: Executing operation Type matches/Group/GroupByKey/Read+Type matches/Group/GroupByKey/GroupByWindow+Type matches/Group/Map(_merge_tagged_vals_under_key)+Type matches/Unkey+Type matches/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:16:28.909Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Group/GroupByKey/Read+Type matches/Group/GroupByKey/GroupByWindow+Type matches/Group/Map(_merge_tagged_vals_under_key)+Type matches/Unkey+Type matches/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:16:28.998Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:16:29.073Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:16:29.152Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:16:29.183Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:16:32.599Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:16:39.320Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:16:39.370Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:16:39.401Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-01_23_09_26-12084421336843602370 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT fruit from `python_query_to_table_16122497544307.output_table`; to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/8d0ab28e-57ad-4919-b956-57a2e5198c8b?maxResults=0&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anondc57cf8609a1ffc1dc10cab93fcdc834e45ed3d7/data?prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT fruit from `python_query_to_table_16122497544307.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum: 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72
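(For context, a rough sketch of what the bigquery_matcher lines above amount to: run the verification query with application-default credentials, then hash the sorted rows so the result can be compared against an expected checksum. The row normalization and hashing details here are assumptions, not the matcher's exact implementation:)

    import hashlib
    from google.cloud import bigquery

    client = bigquery.Client(project='apache-beam-testing')
    rows = list(client.query(
        'SELECT fruit from `python_query_to_table_16122497544307.output_table`').result())

    # Normalize each row to a string, sort, and hash so the checksum is stable
    # regardless of the order in which BigQuery returns the rows.
    row_strings = sorted(','.join(str(v) for v in row.values()) for row in rows)
    checksum = hashlib.sha1('\n'.join(row_strings).encode('utf-8')).hexdigest()
    print(len(rows), checksum)
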
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:16:52.241Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:16:52.276Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:17:36.173Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:17:36.211Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:17:36.266Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-01_23_10_32-50324109307050463 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:18:04.670Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:18:07.796Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:18:07.860Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:18:07.910Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:18:08.009Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:18:17.237Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:18:17.321Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:18:17.380Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:18:17.426Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:18:17.489Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:19:02.041Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:19:02.079Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:19:02.176Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-01_23_12_17-6355151572674506279 is in state JOB_STATE_DONE
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2021-02-01_23_06_42-8429234955222021905 after 901 seconds
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT number FROM python_pubsub_bq_16122495919398.output_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/a8a317b3-2a28-4d09-a0d4-a4eaac54280f?maxResults=0&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon0d089a80_7898_46b8_b6de_f1c280d0fe70/data?prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(2,), (3,), (1,), (0,)]
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:22:01.422Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:22:01.534Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:22:01.585Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:22:01.641Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:22:10.817Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:22:10.894Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:22:10.964Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:22:11.017Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:22:11.065Z: JOB_MESSAGE_BASIC: Stopping worker pool...
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/python_pubsub_bq_16122495919398?deleteContents=true&prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:23:11.380Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:23:11.422Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-02T07:23:11.463Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-01_23_15_47-1508648853722721173 is in state JOB_STATE_DONE
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 68 tests in 4544.297s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 197

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:postCommitPy37IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 22m 56s
216 actionable tasks: 154 executed, 58 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/quxatwf53e2q2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org