Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/07/30 19:10:16 UTC

Build failed in Jenkins: beam_PostCommit_Python36 #2744

See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/2744/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10331] Add SnowflakeIO to list of built-in IO transforms. (#12099)


------------------------------------------
[...truncated 15.14 MB...]
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "m_out.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "serialized_fn": "QlpoOTFBWSZTWXI5+EkAApP/8H/////////////cwr///+ZrwCAAAEBAAvYZlANGGqYqeamp6IPUGGptT1GQADTQ0DQDRp6T1AHqeoBoPU9QZGTRkaCVNAmjQTTKeon6jJMTQAAAAAAAAAAAAA0aBoKaETI0GmJ6gAA0AAAAAAaNAAMgNNDQAHAAAAAAAAAAAAAAAAAAAAMjtJpzFFK4LYqyzjaFELa4VuSZq6LrLJsLgAp9yw6aAEwjBSTUyE4oiRSEJhDnlkr+AQNjGF8k++GNttBFVUVVI0QGgQgktElTZs51pkwhNCuGH3orT8g4ZUAQIYOSiGZlMMiCGAIz0OuasKyOuS9Ik8HI2BSHCiEFKMcXzQwDMHKitKyCim3MkNTwEZ9AUtsyJ+fibxgf1nfK3aotYaCkQpLJIovY7PmiTvRjxoxk1NIosScQM2oIpPHGOKAla0zdncrbnVmLLXfovDxyval1L1TvSprS8481pR3nwW5LjRT0UnvXsc4VQd3YYvCGaB08Rt9pulHKIEj0ILZuFtOO5cpK+QcL1jIqDguD44OS/8TqCWiAzig/mAiehklIPFsRKofUbX+K7pAI0G3EjKMe1tQovpjs/wTRhU1TACAFYLAIUJAHiS8Jm0ZKgmH7D+8SpKVoKnUZdIVTPwGfiHwxZzs5/VAz2lAusURJHd/Vx7TnKh4h2qbUaNcIjvyLapg9cBohcZBqAXVgwgwvllBlUMoIe3t/GoCmUxXGCuYtesFih2Qg159zLL2YbPyQzDwItV5jnsK5laCrC8stwii9NikbsSVXZRmCiVySa/ZBM96I9SjQEqaQMNj2DxyYAh3HqHox6lSlC0GtGToghKCOQE1VBHZ0GTeyAsTA9yl45wy8h/gfqKFCKoMqsD9AtKRJkLi2SSa6kXrczb6WlIwv/mlKwmnWvT2Ulf80XzRuv02tnmZc5dcJEVXUtHaa1zpmuHOauCQtV8qoiQXVVivQ4UEOmYMCPTV7rAghBHnCxDYCqKlxpwnGPHn25pghqWDG7HSooLYsHh3pNoycwEbWJXM4Wh0dyOroqfGOSZIQHLVojbrIzPU4+idbjP21swTiRZUETG3Vqeh/f9GOKGzmMK8gp22cU5fdYS4WqVI/CzElox2gikQlNrPFLJiV5HlZroCAg++qtWtW/f6DXWpu/4u5IpwoSDkc/CSA",
        "user_name": "m_out"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
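
A note on the "serialized_fn" and coder "@type" payloads above: they are base64-encoded, bzip2-compressed blobs (the leading "QlpoOTF" base64-decodes to the bzip2 magic "BZh9"), wrapping what appears to be the pickled Python function or coder that the SDK ships to Dataflow. A minimal sketch of peeking inside one with only the standard library (the blob shown here is truncated in the log, so a complete value would need to be pasted in):

    import base64
    import bz2

    blob = "QlpoOTFBWSZTWXI5+EkAApP..."   # a complete "serialized_fn" value
    compressed = base64.b64decode(blob)
    print(compressed[:4])                 # b'BZh9' -- the bzip2 magic
    payload = bz2.decompress(compressed)  # pickled bytes; actually unpickling
    print(len(payload))                   # them would require apache_beam installed
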
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:36.420Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:36.462Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:36.488Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: '2020-07-30T19:02:41.271590Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2020-07-30_12_02_39-6185590999990343867'
 location: 'us-central1'
 name: 'beamapp-jenkins-0730190232-640309'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2020-07-30T19:02:41.271590Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-07-30_12_02_39-6185590999990343867]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2020-07-30_12_02_39-6185590999990343867
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-30_12_02_39-6185590999990343867?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-30_12_02_39-6185590999990343867?project=apache-beam-testing
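
The interleaved JOB_STATE_* lines in this log come from the runner polling each job's state. The same check can be reproduced by hand against the Dataflow REST API; a rough sketch, assuming google-api-python-client and application default credentials:

    from googleapiclient.discovery import build

    dataflow = build("dataflow", "v1b3")
    job = dataflow.projects().locations().jobs().get(
        projectId="apache-beam-testing",
        location="us-central1",
        jobId="2020-07-30_12_02_39-6185590999990343867",
    ).execute()
    print(job["currentState"])  # e.g. JOB_STATE_RUNNING, then JOB_STATE_DONE
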
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-07-30_11_55_14-1250423987100676832 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT bytes, date, time FROM python_write_to_table_15961353039015.python_no_schema_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-07-30_12_02_39-6185590999990343867 is in state JOB_STATE_RUNNING
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/225b6b26-fcc4-43be-aaec-af0b2bfcaf34?maxResults=0&location=US HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anoneff0b2a8104fcd4203afcaf22962a9ca9d034b7b/data HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(b'xyw', datetime.date(2011, 1, 1), datetime.time(23, 59, 59, 999999)), (b'abc', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'\xab\xac\xad', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'\xe4\xbd\xa0\xe5\xa5\xbd', datetime.date(3000, 12, 31), datetime.time(23, 59, 59))]
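
That verification is essentially a direct client query; a sketch of the same check with google-cloud-bigquery (assuming default credentials, as in the auth flow above):

    from google.cloud import bigquery

    client = bigquery.Client(project="apache-beam-testing")
    sql = ("SELECT bytes, date, time "
           "FROM python_write_to_table_15961353039015.python_no_schema_table")
    rows = [tuple(row.values()) for row in client.query(sql).result()]
    print(rows)  # compared against the expected rows, as the matcher logs above
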
INFO:apache_beam.io.gcp.bigquery_write_it_test:Deleting dataset python_write_to_table_15961353039015 in project apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:48.598Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:39.723Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-07-30_12_02_39-6185590999990343867. The number of workers will be between 1 and 1000.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:39.723Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-07-30_12_02_39-6185590999990343867.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:44.877Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:45.661Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:45.695Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:45.732Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:45.760Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:45.828Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:45.872Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:45.909Z: JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:45.940Z: JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:45.976Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:46.009Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:46.046Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:46.070Z: JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:46.147Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:46.191Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:46.215Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:46.238Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:46.640Z: JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:46.717Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:46.777Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:46.813Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:46.944Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:47.045Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:47.129Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:51.714Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:51.789Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:51.829Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:51.880Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:59.218Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:59.305Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:59.498Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:59.541Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:02:59.571Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:03:11.347Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:03:16.235Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
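
The warning above links the raw Monitoring API calls for cleaning up old descriptors; an equivalent sketch with the google-cloud-monitoring client (a 2.x-style API is assumed, and the "custom.googleapis.com/dataflow" type prefix is an assumption -- inspect descriptor.type before deleting anything):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    for descriptor in client.list_metric_descriptors(name="projects/apache-beam-testing"):
        # Assumed prefix for Dataflow-created custom metrics; verify first.
        if descriptor.type.startswith("custom.googleapis.com/dataflow"):
            client.delete_metric_descriptor(name=descriptor.name)
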
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:03:52.367Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:03:52.406Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:03:52.445Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-07-30_11_56_41-8087139072439797261 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:04:23.804Z: JOB_MESSAGE_BASIC: Executing BigQuery import job "dataflow_job_11779227134680976443". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_11779227134680976443".
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:04:34.297Z: JOB_MESSAGE_BASIC: BigQuery import job "dataflow_job_11779227134680976443" done.
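
The suggested "bq show -j" check can also be done in Python; a sketch with google-cloud-bigquery (the "US" location is an assumption based on the query URLs above):

    from google.cloud import bigquery

    client = bigquery.Client(project="apache-beam-testing")
    job = client.get_job("dataflow_job_11779227134680976443", location="US")
    print(job.job_type, job.state, job.errors)
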
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:04:35.009Z: JOB_MESSAGE_BASIC: Finished operation read+write/NativeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:04:35.073Z: JOB_MESSAGE_DEBUG: Executing success step success1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:04:35.183Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:04:35.362Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:04:35.388Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:04:54.878Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:04:54.909Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:05:28.687Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:05:28.725Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:05:28.766Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-07-30_11_59_13-3154641049471866776 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT fruit from `python_query_to_table_15961355437343.output_table`; to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/c975ea67-2273-4bea-8f92-b34aa3eb7e1f?maxResults=0&location=US HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anonc8940bf7bb643b09f3fbe9f7c480b1863f2fa414/data HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT fruit from `python_query_to_table_15961355437343.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum: 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:08:41.111Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:08:41.184Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:08:41.301Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:08:41.540Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:08:50.459Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:08:50.527Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:08:50.644Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:08:50.689Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:08:50.722Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:09:57.344Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:09:57.443Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-30T19:09:57.519Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-07-30_12_02_39-6185590999990343867 is in state JOB_STATE_DONE
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 64 tests in 4018.584s

OK (SKIP=7)

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/direct/common.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 9m 49s
136 actionable tasks: 102 executed, 33 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/ormwuo6owuseo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python36 #2746

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/2746/display/redirect?page=changes>




Build failed in Jenkins: beam_PostCommit_Python36 #2745

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/2745/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Convert katas to use with syntax rather than explicit run call.

[Robert Bradshaw] Move class definition outside of with statement.

[ningk] [BEAM-10545] KernelModel and jest tests

[Robert Bradshaw] Fix placeholder offsets.

[douglas.damon] Add Composite transforms to Go SDK katas

[jiyongjung] Bump google cloud bigquery to 1.26.1

[Rui Wang] remove redundent precommits.

[ningk] Change the syntax of private _onIOPub to a function declaration instead

[douglas.damon] Update stepik

[simonepri] Add failing test for count on an empty pcollection

[simonepri] Fix go count on an empty pcollection

[noreply] [BEAM-10559] Add apache_beam.examples.sql_taxi (#12399)


------------------------------------------
[...truncated 15.14 MB...]
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$QlpoOTFBWSZTWYQR6NMAAEDXwH8QgCEJAEBAv279AmAAIABqEqnqGgaABpoyNAGVHpGgMgDQBk0oR6IeoECBiqqU5NY23ndshzT2UPUOGrg42YPi9VyA8lbwwPJgtghxs5Qq1aWwExCDeMa0RHC2QigTCdizz1nx+LuSKcKEhCCPRpg=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$QlpoOTFBWSZTWYQR6NMAAEDXwH8QgCEJAEBAv279AmAAIABqEqnqGgaABpoyNAGVHpGgMgDQBk0oR6IeoECBiqqU5NY23ndshzT2UPUOGrg42YPi9VyA8lbwwPJgtghxs5Qq1aWwExCDeMa0RHC2QigTCdizz1nx+LuSKcKEhCCPRpg=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
                    },
                    {
                      "@type": "FastPrimitivesCoder$QlpoOTFBWSZTWYQR6NMAAEDXwH8QgCEJAEBAv279AmAAIABqEqnqGgaABpoyNAGVHpGgMgDQBk0oR6IeoECBiqqU5NY23ndshzT2UPUOGrg42YPi9VyA8lbwwPJgtghxs5Qq1aWwExCDeMa0RHC2QigTCdizz1nx+LuSKcKEhCCPRpg=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
                    }
                  ],
                  "is_pair_like": true,
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "m_out.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "serialized_fn": "QlpoOTFBWSZTWXI5+EkAApP/8H/////////////cwr///+ZrwCAAAEBAAvYZlANGGqYqeamp6IPUGGptT1GQADTQ0DQDRp6T1AHqeoBoPU9QZGTRkaCVNAmjQTTKeon6jJMTQAAAAAAAAAAAAA0aBoKaETI0GmJ6gAA0AAAAAAaNAAMgNNDQAHAAAAAAAAAAAAAAAAAAAAMjtJpzFFK4LYqyzjaFELa4VuSZq6LrLJsLgAp9yw6aAEwjBSTUyE4oiRSEJhDnlkr+AQNjGF8k++GNttBFVUVVI0QGgQgktElTZs51pkwhNCuGH3orT8g4ZUAQIYOSiGZlMMiCGAIz0OuasKyOuS9Ik8HI2BSHCiEFKMcXzQwDMHKitKyCim3MkNTwEZ9AUtsyJ+fibxgf1nfK3aotYaCkQpLJIovY7PmiTvRjxoxk1NIosScQM2oIpPHGOKAla0zdncrbnVmLLXfovDxyval1L1TvSprS8481pR3nwW5LjRT0UnvXsc4VQd3YYvCGaB08Rt9pulHKIEj0ILZuFtOO5cpK+QcL1jIqDguD44OS/8TqCWiAzig/mAiehklIPFsRKofUbX+K7pAI0G3EjKMe1tQovpjs/wTRhU1TACAFYLAIUJAHiS8Jm0ZKgmH7D+8SpKVoKnUZdIVTPwGfiHwxZzs5/VAz2lAusURJHd/Vx7TnKh4h2qbUaNcIjvyLapg9cBohcZBqAXVgwgwvllBlUMoIe3t/GoCmUxXGCuYtesFih2Qg159zLL2YbPyQzDwItV5jnsK5laCrC8stwii9NikbsSVXZRmCiVySa/ZBM96I9SjQEqaQMNj2DxyYAh3HqHox6lSlC0GtGToghKCOQE1VBHZ0GTeyAsTA9yl45wy8h/gfqKFCKoMqsD9AtKRJkLi2SSa6kXrczb6WlIwv/mlKwmnWvT2Ulf80XzRuv02tnmZc5dcJEVXUtHaa1zpmuHOauCQtV8qoiQXVVivQ4UEOmYMCPTV7rAghBHnCxDYCqKlxpwnGPHn25pghqWDG7HSooLYsHh3pNoycwEbWJXM4Wh0dyOroqfGOSZIQHLVojbrIzPU4+idbjP21swTiRZUETG3Vqeh/f9GOKGzmMK8gp22cU5fdYS4WqVI/CzElox2gikQlNrPFLJiV5HlZroCAg++qtWtW/f6DXWpu/4u5IpwoSDkc/CSA",
        "user_name": "m_out"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: '2020-07-31T01:03:44.605804Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2020-07-30_18_03_43-13586307378985290057'
 location: 'us-central1'
 name: 'beamapp-jenkins-0731010333-576023'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2020-07-31T01:03:44.605804Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-07-30_18_03_43-13586307378985290057]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2020-07-30_18_03_43-13586307378985290057
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-30_18_03_43-13586307378985290057?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-30_18_03_43-13586307378985290057?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-07-30_18_03_43-13586307378985290057 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:54.750Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:54.813Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:54.844Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:43.033Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-07-30_18_03_43-13586307378985290057.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:43.033Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-07-30_18_03_43-13586307378985290057. The number of workers will be between 1 and 1000.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:48.368Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:49.127Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:49.164Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:49.262Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:49.332Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:49.405Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:49.450Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:49.486Z: JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:49.519Z: JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:49.554Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:49.589Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:49.623Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:49.665Z: JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:49.707Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:49.881Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:49.913Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:49.950Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:50.181Z: JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:50.255Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:50.305Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:50.331Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:50.364Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:50.435Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:03:50.511Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-07-30_17_56_43-10495449710835261101 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:04:12.784Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:04:15.876Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:04:36.178Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:04:39.340Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:04:39.429Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:04:39.488Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:04:39.563Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:04:48.772Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:04:48.856Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:04:48.990Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:04:49.074Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:04:49.106Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:05:34.064Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:05:34.124Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:05:34.166Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-07-30_17_58_15-11105889611875467712 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:06:10.691Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:06:10.721Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:06:25.072Z: JOB_MESSAGE_BASIC: Executing BigQuery import job "dataflow_job_4455724864670129731". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_4455724864670129731".
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:06:36.024Z: JOB_MESSAGE_BASIC: BigQuery import job "dataflow_job_4455724864670129731" done.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:06:36.814Z: JOB_MESSAGE_BASIC: Finished operation read+write/NativeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:06:36.885Z: JOB_MESSAGE_DEBUG: Executing success step success1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:06:37.013Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:06:37.171Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:06:37.193Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:07:37.469Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:07:37.515Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:07:37.553Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-07-30_18_00_23-14181412189525512141 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT fruit from `python_query_to_table_15961572068372.output_table`; to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/65296d05-eb4d-458e-a117-f944f0bd65d5?maxResults=0&location=US HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon1eeea2d219de82caa4b9fc54ee45bf94f47ed478/data HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT fruit from `python_query_to_table_15961572068372.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum: 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:10:11.709Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:10:11.813Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:10:11.890Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:10:11.954Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:10:21.035Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:10:21.096Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:10:21.211Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:10:21.278Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:10:21.316Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:11:12.822Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:11:12.906Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-31T01:11:12.950Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-07-30_18_03_43-13586307378985290057 is in state JOB_STATE_DONE
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 64 tests in 4110.982s

OK (SKIP=7)

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/direct/common.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 11m 4s
136 actionable tasks: 102 executed, 33 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/dnnqeoofky2n2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
