Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/04/20 19:54:22 UTC

Build failed in Jenkins: beam_PostCommit_Python37 #3712

See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/3712/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Populate runtime parameters in SDK worker startup.

[Robert Bradshaw] [BEAM-6597] Use runner capabilities to activate short ids.

[Robert Bradshaw] Fix wording.

[Robert Bradshaw] Comment about runtime value usage.

[noreply] Add tensorboard-data-server license URL for Python container image.


------------------------------------------
[...truncated 56.76 MB...]
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "bytes_to_proto_str"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "WriteToPubSub/ToProtobuf.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s2"
        },
        "serialized_fn": "ref_AppliedPTransform_WriteToPubSub-ToProtobuf_6",
        "user_name": "WriteToPubSub/ToProtobuf"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s4",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s3"
        },
        "pubsub_id_label": "id",
        "pubsub_serialized_attributes_fn": "",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_output93aa0698-3114-497a-84cb-330b3506ac20",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
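
The job graph above corresponds to a three-step streaming pipeline. Below is a minimal sketch of such a pipeline in the Beam Python SDK; the step names (modify_data, WriteToPubSub) and the 'id'/'timestamp' labels come from the graph itself, while the resource paths and the body of modify_data are hypothetical stand-ins for whatever the integration test actually does:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Hypothetical resource paths; the real test suffixes them with a UUID.
    INPUT_SUBSCRIPTION = 'projects/apache-beam-testing/subscriptions/<input-subscription>'
    OUTPUT_TOPIC = 'projects/apache-beam-testing/topics/<output-topic>'

    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (p
         # timestamp_attribute matches the custom 'timestamp' label in the graph
         | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
               subscription=INPUT_SUBSCRIPTION, timestamp_attribute='timestamp')
         # placeholder for the test's modify_data step
         | 'modify_data' >> beam.Map(lambda data: data)
         # expands into the ToProtobuf and Write/NativeWrite steps shown above
         | 'WriteToPubSub' >> beam.io.WriteToPubSub(
               topic=OUTPUT_TOPIC, id_label='id', timestamp_attribute='timestamp'))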
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: '2021-04-20T19:20:38.703035Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2021-04-20_12_20_37-824124883457502437'
 location: 'us-central1'
 name: 'beamapp-jenkins-0420192030-672836'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2021-04-20T19:20:38.703035Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2021-04-20_12_20_37-824124883457502437]
apache_beam.runners.dataflow.internal.apiclient: INFO: Submitted job: 2021-04-20_12_20_37-824124883457502437
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_20_37-824124883457502437?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2021-04-20_12_20_37-824124883457502437 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:20:42.500Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:20:44.161Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:20:44.193Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:20:44.358Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:20:44.408Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:20:44.439Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:20:44.476Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:20:44.508Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:20:44.549Z: JOB_MESSAGE_DETAILED: Fusing consumer modify_data into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:20:44.572Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/ToProtobuf into modify_data
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:20:44.600Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write/NativeWrite into WriteToPubSub/ToProtobuf
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:20:44.679Z: JOB_MESSAGE_BASIC: The pubsub read for: projects/apache-beam-testing/subscriptions/psit_subscription_input93aa0698-3114-497a-84cb-330b3506ac20 is configured to compute input data watermarks based on custom timestamp attribute timestamp. Cloud Dataflow has created an additional tracking subscription to do this, which will be cleaned up automatically. For details, see: https://cloud.google.com/dataflow/model/pubsub-io#timestamps-ids
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:20:44.711Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:20:44.766Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:20:44.793Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:20:44.826Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:20:44.976Z: JOB_MESSAGE_DEBUG: Executing wait step start19
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:20:46.316Z: JOB_MESSAGE_DETAILED: Pub/Sub resources set up for topic 'projects/apache-beam-testing/topics/psit_topic_input93aa0698-3114-497a-84cb-330b3506ac20'.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:20:46.411Z: JOB_MESSAGE_BASIC: Executing operation ReadFromPubSub/Read+modify_data+WriteToPubSub/ToProtobuf+WriteToPubSub/Write/NativeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:20:46.459Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:20:46.487Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:21:20.472Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
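One possible cleanup for the descriptor-quota issue in the message above, sketched with the google-cloud-monitoring v2 client; the project name is taken from this build, but the decision of which descriptors are unused is an assumption left to the operator:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'
    for descriptor in client.list_metric_descriptors(name=project_name):
        # Dataflow-created user metrics live under custom.googleapis.com/*;
        # deleting stale descriptors frees quota (verify they are unused first).
        if descriptor.type.startswith('custom.googleapis.com/'):
            client.delete_metric_descriptor(name=descriptor.name)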
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:21:36.375Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:22:10.199Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-20T19:22:10.233Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2021-04-20_12_20_37-824124883457502437 after 182 seconds
google.auth._default: DEBUG: Checking None for explicit credentials as part of auth process...
google.auth._default: DEBUG: Checking Cloud SDK credentials as part of auth process...
google.auth._default: DEBUG: Cloud SDK credentials not found on disk; not using them
google.auth._default: DEBUG: Checking for App Engine runtime as part of auth process...
google.auth._default: DEBUG: No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fpubsub
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fpubsub HTTP/1.1" 200 244
apache_beam.io.gcp.tests.pubsub_matcher: ERROR: Timeout after 300 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/psit_subscription_output93aa0698-3114-497a-84cb-330b3506ac20.
--------------------- >> end captured logging << ---------------------
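
The matcher behind this failure waits up to 300 seconds while pulling from the output subscription. A rough equivalent of that wait, assuming the google-cloud-pubsub v2 client and a hypothetical subscription path:

    import time
    from google.cloud import pubsub_v1

    SUBSCRIPTION = 'projects/apache-beam-testing/subscriptions/<output-subscription>'
    subscriber = pubsub_v1.SubscriberClient()

    deadline = time.time() + 300  # mirrors the matcher's timeout
    received = []
    while time.time() < deadline and not received:
        # an empty pull may instead raise DeadlineExceeded on some client versions
        response = subscriber.pull(
            request={'subscription': SUBSCRIPTION, 'max_messages': 50},
            timeout=30)
        for msg in response.received_messages:
            received.append(msg.message.data)
            subscriber.acknowledge(
                request={'subscription': SUBSCRIPTION, 'ack_ids': [msg.ack_id]})
    print('Received %d messages' % len(received))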
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_32_49-9774883007635015760?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_46_31-13284768652900969221?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_54_46-4092814936662925559?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_02_40-11053919687456480905?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_12_16-4954864887695950415?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_20_37-824124883457502437?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_30_16-18071950589588122697?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_38_26-5801833663569620195?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_46_29-13727044259460208535?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_32_47-8427927910191708149?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_52_59-4192608854206374989?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_01_30-4675158025221317827?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_11_10-8706106106061188778?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_20_36-18405779515426476861?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_28_49-18414198092469163890?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_38_53-11065788443935252559?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_32_47-5430651604120518592?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_44_23-2470906983690482520?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_53_02-17806195185037732420?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_02_36-13241537584446314358?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_11_09-2858785753747231054?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_19_18-17572789201751394156?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_27_27-11778084034970361742?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_36_06-2279600465303011641?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_32_42-15447894582652132915?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_51_48-7794601983306835771?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_02_15-9208855875763224603?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_11_17-14567239771295339846?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_21_10-16423471298276983614?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_30_14-17334629407461262156?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_38_29-17326359896525410960?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_36_08-11265564023196887293?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_45_59-962751823373738556?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_54_08-18116482852281688180?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_12_00-17581547778094160892?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_20_13-15495842628648118172?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_28_11-12801124081330214346?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_36_33-15272458262125469292?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_44_13-15348834230936015654?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_32_41-15163126281584205708?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_42_12-4835416523164186579?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_51_00-14818721158800585649?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_58_59-2698493948484788938?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_06_59-3764849334026583362?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_14_47-6588121779355083765?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_24_13-6183754556934189871?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_33_42-1512593673003414847?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_43_03-9873635232215322734?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_32_46-16793067642046914030?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_42_05-11644430812201353113?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_52_21-13492532994175240575?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_03_25-6326986496713091067?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_12_39-6420534262980912211?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_29_22-6217939837672100440?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_32_41-2721994970000610116?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_42_13-17844743025638846374?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_11_51_28-7000633684084892872?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_00_56-293822491062764628?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_10_07-8407495461057708564?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_19_23-17534285042043399354?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_27_47-17900268242288387408?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_12_36_54-13037657464605914533?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 69 tests in 4935.744s

FAILED (SKIP=6, failures=1)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 30m 1s
210 actionable tasks: 163 executed, 43 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/wcfxst2rz52dy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python37 #3719

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/3719/display/redirect?page=changes>



Build failed in Jenkins: beam_PostCommit_Python37 #3718

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/3718/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12191] Add a test for python template generation with upload_graph

[noreply] [BEAM-7372] fix wrong usage of with_traceback (#14566)

[noreply] [BEAM-7372] cleanup codes for py2 from apache_beam/transforms (#14544)

[noreply] [BEAM-2085] Fixups for Python resource hints. (#14605)


------------------------------------------
[...truncated 57.40 MB...]
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 244
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/python_pubsub_bq_16190753893640?deleteContents=true&prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:27:19.047Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:27:19.087Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:27:19.130Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-04-22_00_19_37-7645290781945250882 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:10.983Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+write/BigQueryBatchFileLoads/LoadJobNamePrefix+write/BigQueryBatchFileLoads/SchemaModJobNamePrefix+write/BigQueryBatchFileLoads/CopyJobNamePrefix+write/BigQueryBatchFileLoads/GenerateFilePrefix
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.122Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/LoadJobNamePrefix.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.152Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/SchemaModJobNamePrefix.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.183Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/CopyJobNamePrefix.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.217Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/GenerateFilePrefix.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.258Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.316Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.318Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.361Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/_UnpickledSideInput(SchemaModJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.378Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.402Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.424Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/_UnpickledSideInput(SchemaModJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.431Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.458Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.470Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.487Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.507Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.530Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.551Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.599Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/_UnpickledSideInput(SchemaModJobNamePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.639Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.669Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.732Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:11.875Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupShardedRows/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:12.620Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupShardedRows/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:12.695Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:12.953Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:13.150Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:13.214Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:13.302Z: JOB_MESSAGE_BASIC: Executing operation create/Read+write/BigQueryBatchFileLoads/RewindowIntoGlobal+write/BigQueryBatchFileLoads/AppendDestination+write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+write/BigQueryBatchFileLoads/GroupShardedRows/Reify+write/BigQueryBatchFileLoads/GroupShardedRows/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:30.521Z: JOB_MESSAGE_BASIC: Finished operation create/Read+write/BigQueryBatchFileLoads/RewindowIntoGlobal+write/BigQueryBatchFileLoads/AppendDestination+write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+write/BigQueryBatchFileLoads/GroupShardedRows/Reify+write/BigQueryBatchFileLoads/GroupShardedRows/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:30.603Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupShardedRows/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:30.657Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupShardedRows/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:30.739Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupShardedRows/Read+write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+write/BigQueryBatchFileLoads/DropShardNumber+write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:33.858Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupShardedRows/Read+write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+write/BigQueryBatchFileLoads/DropShardNumber+write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:33.927Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:33.990Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:34.069Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:46.700Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:46.786Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:46.818Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).TemporaryTables" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:46.860Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:46.902Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:46.929Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:46.960Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:46.962Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:46.978Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:46.994Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:47.011Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:47.030Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:47.060Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:47.064Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:47.097Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:47.132Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/ParDo(UpdateDestinationSchema)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:47.170Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/Flatten.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:47.255Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:49.370Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:49.535Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:49.584Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:49.649Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:53.984Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:59.026Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:59.089Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:59.151Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:59.208Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:59.227Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:58.280Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/ParDo(UpdateDestinationSchema)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:58.377Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:58.411Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:58.558Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:58.617Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:58.696Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:29:58.808Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Read+write/BigQueryBatchFileLoads/WaitForSchemaModJobs/WaitForSchemaModJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:05.536Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Read+write/BigQueryBatchFileLoads/WaitForSchemaModJobs/WaitForSchemaModJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:05.624Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForSchemaModJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:05.732Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:05.808Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:05.915Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:06.012Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:09.694Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:09.786Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:09.869Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:09.932Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:10.013Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:10.089Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:16.329Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:16.400Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForCopyJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:16.476Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:16.523Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:16.603Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:16.684Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:16.913Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:16.993Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:17.088Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Read+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:21.664Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Read+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:21.748Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:21.805Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:21.915Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:23.937Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:24.058Z: JOB_MESSAGE_DEBUG: Executing success step success48
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:24.154Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:24.210Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:24.250Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:50.270Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:50.314Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:30:50.346Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-04-22_00_22_38-7906584718119494353 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:31:14.708Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:31:14.765Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T07:31:14.811Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-04-22_00_22_34-8446980897106721657 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT bytes, date, time FROM python_write_to_table_16190761381400.python_no_schema_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 244
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/8dc995e8-8ae2-4715-8432-5652269508f6?maxResults=0&timeoutMs=10000&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/8dc995e8-8ae2-4715-8432-5652269508f6?fields=jobReference%2CtotalRows%2CpageToken%2Crows&location=US&formatOptions.useInt64Timestamp=True&prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(b'xyw', datetime.date(2011, 1, 1), datetime.time(23, 59, 59, 999999)), (b'abc', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'\xab\xac\xad', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'\xe4\xbd\xa0\xe5\xa5\xbd', datetime.date(3000, 12, 31), datetime.time(23, 59, 59))]
INFO:apache_beam.io.gcp.bigquery_write_it_test:Deleting dataset python_write_to_table_16190761381400 in project apache-beam-testing
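The BigQueryBatchFileLoads steps throughout this log are the expansion of WriteToBigQuery in FILE_LOADS mode, and the query above is how the matcher verifies the result. A minimal sketch of both halves, with hypothetical rows and a placeholder dataset name:

    import base64
    import apache_beam as beam
    from apache_beam.io.gcp.bigquery import WriteToBigQuery
    from google.cloud import bigquery

    # Hypothetical rows matching the verified columns; BYTES values are
    # base64-encoded for JSON-based load jobs.
    rows = [{'bytes': base64.b64encode(b'abc').decode('utf-8'),
             'date': '2000-01-01', 'time': '00:00:00'}]

    with beam.Pipeline() as p:
        (p
         | 'create' >> beam.Create(rows)
         # FILE_LOADS stages rows to files, then runs BigQuery load/copy jobs:
         # the WriteRecordsToFile / TriggerLoadJobs / TriggerCopyJobs /
         # RemoveTempTables steps seen in the log above
         | 'write' >> WriteToBigQuery(
               table='apache-beam-testing:<dataset>.python_no_schema_table',
               schema='bytes:BYTES,date:DATE,time:TIME',
               method=WriteToBigQuery.Method.FILE_LOADS))

    # Verification, roughly what bigquery_matcher does with the query above:
    client = bigquery.Client(project='apache-beam-testing')
    query = 'SELECT bytes, date, time FROM <dataset>.python_no_schema_table'
    print(list(client.query(query).result()))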
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
Test that schema update options are respected when appending to an existing ... ok
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 69 tests in 4863.208s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
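
One way to reproduce this failure locally with the diagnostics suggested above is to invoke the failing task directly through the Gradle wrapper (a minimal sketch, assuming a Beam source checkout with ./gradlew at the repository root; the task path is taken from the failure message):

    ./gradlew :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch \
        --stacktrace --info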

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 31m 6s
210 actionable tasks: 165 executed, 41 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.
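
This warning is typically cleared by raising the inotify watch limit on the build host (a sketch, assuming a Linux agent with sysctl available; 524288 is a commonly used value, not one mandated by Gradle):

    # raise the limit for the running system
    sudo sysctl fs.inotify.max_user_watches=524288
    # persist it across reboots
    echo fs.inotify.max_user_watches=524288 | sudo tee -a /etc/sysctl.conf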

Publishing build scan...
https://gradle.com/s/xzgnauzyin2xe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #3717

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/3717/display/redirect?page=changes>

Changes:

[sychen] Add a BQ option for configuring buffering duration when auto-sharding is

[suztomo] [BEAM-11010] Upgrading google-cloud-pubsublite to 0.13.2

[suztomo] [BEAM-11010] Copying SubscriberOptions from pubsublite repo

[suztomo] [BEAM-11010] Declaring flogger-system-backend to avoid conflicts

[Kyle Weaver] [BEAM-12194] Enable SqlTransform::registerUdaf in ZetaSQL.

[Kyle Weaver] [BEAM-12194] Code style changes from review.

[Boyuan Zhang] [BEAM-12114] Dataflow should apply KAFKA_READ_OVERRIDE when it's not

[noreply] Make sdk/worker_harness_container_image fully backwards compatible


------------------------------------------
[...truncated 57.21 MB...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:33:56.892Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:33:56.911Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:33:57Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:33:57.006Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:33:57.034Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:33:57.056Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:33:57.079Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/Flatten.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:33:57.101Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/ParDo(UpdateDestinationSchema)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:33:57.122Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:06.051Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:10.502Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/ParDo(UpdateDestinationSchema)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:10.575Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:10.604Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:10.668Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:10.733Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:10.793Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:10.853Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Read+write/BigQueryBatchFileLoads/WaitForSchemaModJobs/WaitForSchemaModJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:17.360Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Read+write/BigQueryBatchFileLoads/WaitForSchemaModJobs/WaitForSchemaModJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:17.441Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForSchemaModJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:17.509Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:17.555Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:17.611Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:17.725Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:21.311Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:21.370Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:21.420Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:21.488Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:21.259Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:21.302Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:21.363Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:21.412Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:21.472Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:21.569Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:25.008Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:25.071Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForCopyJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:25.129Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:25.172Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:25.231Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:25.365Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:25.536Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:25.604Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:25.662Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Read+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:28.281Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Read+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:28.339Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:28.393Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:30.843Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:30.914Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:30.996Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:31.039Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:31.071Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:28.679Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:29.968Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:30.026Z: JOB_MESSAGE_DEBUG: Executing success step success48
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:30.129Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:30.176Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:34:30.214Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:35:14.290Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:35:14.331Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:35:14.366Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:35:22.875Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:35:22.913Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-22T01:35:22.975Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-04-21_18_26_30-2694471045816414796 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT bytes, date, time FROM python_write_to_table_16190547769085.python_no_schema_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 244
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/52a8a4ad-ea2e-4fcf-b5aa-ef9f0c382611?maxResults=0&timeoutMs=10000&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/52a8a4ad-ea2e-4fcf-b5aa-ef9f0c382611?fields=jobReference%2CtotalRows%2CpageToken%2Crows&location=US&formatOptions.useInt64Timestamp=True&prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(b'xyw', datetime.date(2011, 1, 1), datetime.time(23, 59, 59, 999999)), (b'abc', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'\xab\xac\xad', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'\xe4\xbd\xa0\xe5\xa5\xbd', datetime.date(3000, 12, 31), datetime.time(23, 59, 59))]
INFO:apache_beam.io.gcp.bigquery_write_it_test:Deleting dataset python_write_to_table_16190547769085 in project apache-beam-testing
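The matcher's verification query above can also be re-run by hand with the bq CLI (a sketch, assuming credentials for apache-beam-testing and that it runs before the cleanup step above deletes the dataset; the table name is taken from the log):

    bq query --project_id=apache-beam-testing --use_legacy_sql=false \
        'SELECT bytes, date, time FROM python_write_to_table_16190547769085.python_no_schema_table'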
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-04-21_18_27_36-5058279242731266949 is in state JOB_STATE_DONE
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_15_42-5582476306958242115?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_29_21-7475927974158679494?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_37_29-4545846299246109965?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_45_42-1960387572104041515?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_53_48-10493983688835715384?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_02_18-1215194444206562530?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_10_55-12869755764652391068?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_19_06-9194768283867685038?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_15_41-17077390541477971808?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_38_56-1922271891516995580?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_49_05-407819745275033917?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_58_39-7532599327321138918?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_08_04-11248156054985425034?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_17_12-2549582499597713454?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_26_30-2694471045816414796?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_15_42-4850938443566676496?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_29_21-3224470604202507711?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_37_15-11667553369533898778?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_45_08-1280329384866847688?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_54_05-2441846244384915090?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_04_13-1433443382337414514?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_11_42-3622154330902785100?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_19_11-9399482992123440160?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_27_36-5058279242731266949?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_15_38-544377294297811216?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_36_09-10545017523425659549?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_46_43-12072768559753090927?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_55_49-5812186707338343745?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_05_10-417672971483970965?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_15_08-10108853450917176347?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_23_52-5572116194699077326?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_18_04-4296508710144870566?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_27_42-3118289798575870037?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_36_27-7313074508844802959?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_45_23-9957644898410027829?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_53_44-346367414185469519?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_02_37-15518394595595054042?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_11_11-7151244098971857831?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_19_12-17562864132995422037?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_15_38-13522956331487224221?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_24_31-1112039266350682097?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_35_01-3074524475827121628?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_47_01-8469232601487067291?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_55_47-2906296103836804313?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_04_16-8666446165798474346?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_13_07-8516292443457880624?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_21_56-15748175975857636565?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_15_40-5424593453041387688?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_24_53-16676742451528294120?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_34_27-1529197012627105879?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_42_32-18382106280762430863?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_50_19-6924702931406167027?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_58_15-13144884397334791556?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_14_48-1741850235476497856?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_15_37-9408043145733818730?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_24_32-8827952775575204903?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_34_32-107956548266885952?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_43_41-18277200868333252233?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_17_53_06-15424970849329154254?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_01_55-10724745758122048564?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_10_14-3161613225716554208?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_18_19_45-7740566711400721258?project=apache-beam-testing
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
Test that schema update options are respected when appending to an existing ... ok
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 69 tests in 4821.094s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 35m 7s
210 actionable tasks: 176 executed, 30 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/x3rlen3t5cr4k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python37 #3716

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/3716/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-12009] Copy CalcRelSplitter.

[Kyle Weaver] [BEAM-12009] Implement Calc splitting rule.

[Kyle Weaver] [BEAM-12009] Use different method signature to avoid spurious null check

[Kyle Weaver] [BEAM-12009] Move CalcRelSplitter to rel subpackage.

[Kyle Weaver] [BEAM-12009] Reduce code duplication between rules by re-implementing

[Kyle Weaver] [BEAM-12009] Only match in BeamCalcSplittingRule if ≥ 1 of its component

[anup.d] BEAM-12166:Beam Sql - Combine Accumulator return Map fails with class

[suztomo] [BEAM-8357] Upgrading auto-value to 1.8 from 1.7.4

[Kenneth Knowles] Build source release zip from RC tag

[aromanenko.dev] [BEAM-12197] TPC-DS: Fix SQL-queries syntax


------------------------------------------
[...truncated 59.44 MB...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:30.830Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/LoadJobNamePrefix.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:30.862Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/SchemaModJobNamePrefix.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:30.895Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/CopyJobNamePrefix.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:30.920Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/GenerateFilePrefix.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:30.943Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:30.978Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:30.984Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:31.007Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/_UnpickledSideInput(SchemaModJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:31.025Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:31.060Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:31.081Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/_UnpickledSideInput(SchemaModJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:31.089Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:31.115Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:31.123Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:31.142Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:31.151Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:31.190Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:31.191Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:31.229Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/_UnpickledSideInput(SchemaModJobNamePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:31.272Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:31.331Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:31.349Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:31.408Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupShardedRows/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:31.915Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupShardedRows/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:31.992Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:32.073Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:32.236Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:32.324Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:32.383Z: JOB_MESSAGE_BASIC: Executing operation create/Read+write/BigQueryBatchFileLoads/RewindowIntoGlobal+write/BigQueryBatchFileLoads/AppendDestination+write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+write/BigQueryBatchFileLoads/GroupShardedRows/Reify+write/BigQueryBatchFileLoads/GroupShardedRows/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:49.660Z: JOB_MESSAGE_BASIC: Finished operation create/Read+write/BigQueryBatchFileLoads/RewindowIntoGlobal+write/BigQueryBatchFileLoads/AppendDestination+write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+write/BigQueryBatchFileLoads/GroupShardedRows/Reify+write/BigQueryBatchFileLoads/GroupShardedRows/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:49.750Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupShardedRows/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:49.800Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupShardedRows/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:49.855Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupShardedRows/Read+write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+write/BigQueryBatchFileLoads/DropShardNumber+write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:53.002Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupShardedRows/Read+write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+write/BigQueryBatchFileLoads/DropShardNumber+write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:53.089Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:53.139Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:31:53.203Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:05.714Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:05.789Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:05.823Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).TemporaryTables" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:05.852Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:05.872Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:05.900Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:05.914Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:05.925Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:05.942Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:05.947Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:05.968Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:05.977Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:06.005Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:06.012Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:06.053Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/ParDo(UpdateDestinationSchema)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:06.091Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:06.115Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/Flatten.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:06.140Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:15.639Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/ParDo(UpdateDestinationSchema)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:15.695Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:15.718Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:15.787Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:15.870Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:15.922Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:15.987Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Read+write/BigQueryBatchFileLoads/WaitForSchemaModJobs/WaitForSchemaModJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:18.602Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:25.702Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Read+write/BigQueryBatchFileLoads/WaitForSchemaModJobs/WaitForSchemaModJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:25.776Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForSchemaModJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:25.832Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:25.883Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:25.945Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:26.010Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:29.756Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:29.811Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:29.867Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:29.913Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:29.977Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:30.054Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:36.455Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:36.522Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForCopyJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:36.586Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:36.623Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:36.678Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:36.746Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:37.020Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:37.088Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:37.154Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Read+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:42.824Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Read+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:42.873Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:42.930Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:42.988Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:46.048Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:46.211Z: JOB_MESSAGE_DEBUG: Executing success step success48
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:46.335Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:46.382Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:32:46.412Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:33:31.664Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:33:31.761Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:33:31.811Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-04-21_12_23_48-11703234737545700746 is in state JOB_STATE_DONE
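
The JOB_MESSAGE lines above trace Beam's BigQueryBatchFileLoads sink end to end: records are written to files, grouped by destination table, loaded (through temporary tables where needed), copied to the final destination, and the temporary tables are deduplicated and deleted. As a minimal sketch only, the snippet below shows the kind of batch write that exercises this path; the table name and schema are hypothetical placeholders, not values from the job above.

    # Sketch of a batch write that takes the file-loads path logged above.
    # 'my-project:my_dataset.my_table' and the schema are hypothetical.
    import apache_beam as beam
    from apache_beam.io.gcp.bigquery import WriteToBigQuery

    with beam.Pipeline() as p:
        _ = (
            p
            | 'create' >> beam.Create([{'name': 'a', 'value': 1}])
            | 'write' >> WriteToBigQuery(
                table='my-project:my_dataset.my_table',
                schema='name:STRING,value:INTEGER',
                method=WriteToBigQuery.Method.FILE_LOADS,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))

With FILE_LOADS, the TriggerLoadJobs, WaitFor*, and TriggerCopyJobs stages in the log correspond to the load, wait, and copy phases of this sink.
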
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT bytes, date, time FROM python_write_to_table_16190330145942.python_no_schema_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found, so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 244
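
The DEBUG lines above show google-auth's Application Default Credentials search order: explicit credentials, Cloud SDK credentials, the App Engine runtime, and finally the GCE metadata server, which issues the token here. A minimal sketch of the same resolution, assuming the google-auth package is available:

    # Resolves credentials in the order logged above; on a GCE worker this
    # terminates at the metadata server. Scopes mirror the token request.
    import google.auth

    credentials, project_id = google.auth.default(
        scopes=['https://www.googleapis.com/auth/bigquery',
                'https://www.googleapis.com/auth/cloud-platform'])
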
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/383c4391-2708-4310-88e9-ca6ded4458ee?maxResults=0&timeoutMs=10000&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/383c4391-2708-4310-88e9-ca6ded4458ee?fields=jobReference%2CtotalRows%2CpageToken%2Crows&location=US&formatOptions.useInt64Timestamp=True&prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(b'xyw', datetime.date(2011, 1, 1), datetime.time(23, 59, 59, 999999)), (b'\xe4\xbd\xa0\xe5\xa5\xbd', datetime.date(3000, 12, 31), datetime.time(23, 59, 59)), (b'abc', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'\xab\xac\xad', datetime.date(2000, 1, 1), datetime.time(0, 0))]
INFO:apache_beam.io.gcp.bigquery_write_it_test:Deleting dataset python_write_to_table_16190330145942 in project apache-beam-testing
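
The bigquery_matcher verifies the pipeline's output by issuing the SELECT above and comparing the returned rows against the expected data, after which the test tears down its dataset. A minimal sketch of that verification with the google-cloud-bigquery client; the dataset name below is a hypothetical stand-in for the generated python_write_to_table_* dataset:

    # Sketch of the verification query; 'my_dataset' is hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client(project='apache-beam-testing')
    query = 'SELECT bytes, date, time FROM my_dataset.python_no_schema_table'
    rows = [tuple(row.values()) for row in client.query(query).result()]
    print(rows)  # compare against the expected (bytes, date, time) tuples
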
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:33:59.539Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:33:59.603Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:33:59.649Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:33:59.697Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:34:08.967Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:34:09.023Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:34:09.080Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:34:09.117Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:34:09.151Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:35:08.052Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:35:08.092Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T19:35:08.128Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-04-21_12_27_06-8153357362329847865 is in state JOB_STATE_DONE
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
Test that schema update options are respected when appending to an existing ... ok
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 69 tests in 5109.678s

OK (SKIP=6)
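
Of the six skips above, three are TestStream tests, which only the direct runners execute. A minimal sketch of TestStream on DirectRunner, assuming a local apache_beam installation:

    # TestStream runs only on DirectRunner/SwitchingDirectRunner, matching
    # the SKIP messages in the test output above.
    import apache_beam as beam
    from apache_beam.testing.test_stream import TestStream
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline(runner='DirectRunner') as p:
        events = (TestStream()
                  .add_elements(['a', 'b'])
                  .advance_watermark_to_infinity())
        assert_that(p | events, equal_to(['a', 'b']))
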

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:postCommitPy37IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 34m 51s
210 actionable tasks: 173 executed, 33 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/eeet4qquos5ne

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure




Build failed in Jenkins: beam_PostCommit_Python37 #3715

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/3715/display/redirect>

Changes:


------------------------------------------
[...truncated 58.07 MB...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:21.992Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/LoadJobNamePrefix.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:22.026Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/SchemaModJobNamePrefix.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:22.062Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/CopyJobNamePrefix.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:22.106Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/GenerateFilePrefix.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:22.133Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:22.165Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:22.187Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:22.200Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/_UnpickledSideInput(SchemaModJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:22.211Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:22.238Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:22.264Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/_UnpickledSideInput(SchemaModJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:22.276Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:22.286Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:22.308Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:22.342Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:22.344Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:22.363Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:22.377Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:22.407Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/_UnpickledSideInput(SchemaModJobNamePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:22.442Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:22.522Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:22.546Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:22.616Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupShardedRows/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:24.505Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupShardedRows/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:24.716Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:24.786Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:25.034Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:25.129Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:25.187Z: JOB_MESSAGE_BASIC: Executing operation create/Read+write/BigQueryBatchFileLoads/RewindowIntoGlobal+write/BigQueryBatchFileLoads/AppendDestination+write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+write/BigQueryBatchFileLoads/GroupShardedRows/Reify+write/BigQueryBatchFileLoads/GroupShardedRows/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:43.175Z: JOB_MESSAGE_BASIC: Finished operation create/Read+write/BigQueryBatchFileLoads/RewindowIntoGlobal+write/BigQueryBatchFileLoads/AppendDestination+write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+write/BigQueryBatchFileLoads/GroupShardedRows/Reify+write/BigQueryBatchFileLoads/GroupShardedRows/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:43.245Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupShardedRows/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:43.296Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupShardedRows/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:43.360Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupShardedRows/Read+write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+write/BigQueryBatchFileLoads/DropShardNumber+write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:46.893Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupShardedRows/Read+write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+write/BigQueryBatchFileLoads/DropShardNumber+write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:46.957Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:47.016Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:47.080Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:23:59.969Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:00.023Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:00.059Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).TemporaryTables" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:00.115Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:00.144Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:00.164Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:00.185Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:00.200Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:00.242Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:00.255Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:00.273Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:00.290Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:00.313Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:00.334Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:00.343Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:00.368Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/ParDo(UpdateDestinationSchema)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:00.403Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:00.428Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/Flatten.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:09.289Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:13.715Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/ParDo(UpdateDestinationSchema)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:13.786Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:13.819Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:13.912Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:13.966Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:14.037Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:14.117Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Read+write/BigQueryBatchFileLoads/WaitForSchemaModJobs/WaitForSchemaModJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:20.994Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Read+write/BigQueryBatchFileLoads/WaitForSchemaModJobs/WaitForSchemaModJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:21.088Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForSchemaModJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:21.174Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:21.248Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:21.297Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:21.363Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:24.285Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:24.338Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:24.393Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:24.447Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:24.527Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:24.593Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:26.412Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:26.476Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForCopyJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:26.546Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:26.598Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:26.665Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:26.727Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:26.928Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:26.994Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:27.063Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Read+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:29.959Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Read+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:30.087Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:30.207Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:30.292Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:31.134Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:31.199Z: JOB_MESSAGE_DEBUG: Executing success step success48
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:31.269Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:31.321Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:24:31.356Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:25:19.466Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:25:19.539Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:25:19.573Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-04-21_06_16_52-16600357030803413545 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT bytes, date, time FROM python_write_to_table_16190109965639.python_no_schema_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found, so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 244
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/1424c1cd-da17-452f-918c-b4adbcad89e7?maxResults=0&timeoutMs=10000&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/1424c1cd-da17-452f-918c-b4adbcad89e7?fields=jobReference%2CtotalRows%2CpageToken%2Crows&location=US&formatOptions.useInt64Timestamp=True&prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(b'xyw', datetime.date(2011, 1, 1), datetime.time(23, 59, 59, 999999)), (b'abc', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'\xab\xac\xad', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'\xe4\xbd\xa0\xe5\xa5\xbd', datetime.date(3000, 12, 31), datetime.time(23, 59, 59))]
INFO:apache_beam.io.gcp.bigquery_write_it_test:Deleting dataset python_write_to_table_16190109965639 in project apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:27:01.230Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:27:01.327Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:27:01.404Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:27:01.454Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:27:10.810Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:27:10.913Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:27:10.988Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:27:11.051Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:27:11.077Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:28:01.489Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:28:01.529Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T13:28:01.553Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-04-21_06_20_01-2379951388918285974 is in state JOB_STATE_DONE
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
Test that schema update options are respected when appending to an existing table ... ok
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 69 tests in 5063.612s

OK (SKIP=6)

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:postCommitPy37IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
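
To reproduce locally, the failing task and the flags suggested above can be combined into one invocation (a sketch, assuming a checkout of the Beam repository; the task path is copied from the error message):

  # Re-run the failing post-commit suite with a stack trace and more log output
  ./gradlew :sdks:python:test-suites:portable:py37:postCommitPy37IT --stacktrace --info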
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
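
Applied to the same task, the suggested flag would look like this (illustrative):

  # Show each deprecation warning individually ahead of the Gradle 7.0 upgrade
  ./gradlew :sdks:python:test-suites:portable:py37:postCommitPy37IT --warning-mode all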

BUILD FAILED in 1h 27m 46s
210 actionable tasks: 150 executed, 56 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.
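
The inotify warning is unrelated to the test failures, but the usual remedy on a Linux build agent is to raise the watch limit (a sketch; the value is the one commonly suggested in the Gradle documentation):

  # Raise the inotify watch limit for the current boot
  sudo sysctl fs.inotify.max_user_watches=524288
  # Persist the setting across reboots
  echo fs.inotify.max_user_watches=524288 | sudo tee -a /etc/sysctl.conf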

Publishing build scan...
https://gradle.com/s/oeirh5rft6x4m

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #3714

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/3714/display/redirect?page=changes>

Changes:

[noreply] Update stringSlice to impl pkg.go.dev/flag#Getter.Get

[noreply] [BEAM-12174] Samza Portable Runner Support (#14554)


------------------------------------------
[...truncated 57.10 MB...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:05.143Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:05.202Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:05.223Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).TemporaryTables" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:05.279Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:05.308Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:05.336Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:05.353Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:05.357Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:05.379Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:05.382Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:05.397Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:05.403Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:05.442Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:05.453Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:05.466Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:05.501Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/ParDo(UpdateDestinationSchema)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:05.530Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:05.556Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/Flatten.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:14.280Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:18.819Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:18.859Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:18.955Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:18.589Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/ParDo(UpdateDestinationSchema)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:18.651Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:18.674Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:18.739Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:18.792Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:18.854Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:18.917Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Read+write/BigQueryBatchFileLoads/WaitForSchemaModJobs/WaitForSchemaModJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-04-21_00_13_42-7346528455194327423 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:25.786Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Read+write/BigQueryBatchFileLoads/WaitForSchemaModJobs/WaitForSchemaModJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:25.837Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForSchemaModJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:25.902Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:25.969Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:26.036Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:26.085Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:29.805Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:29.851Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:29.899Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:29.966Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:30.022Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:30.082Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:33.908Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:33.986Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForCopyJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:34.046Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:34.091Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:34.166Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:34.219Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:34.412Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:34.474Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:34.528Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Read+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:35.094Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Read+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:35.175Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:35.217Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:35.285Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:35.552Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:35.614Z: JOB_MESSAGE_DEBUG: Executing success step success48
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:35.692Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:35.729Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:21:35.778Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:22:17.397Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:22:17.437Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-21T07:22:17.457Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-04-21_00_12_52-8819735790751464427 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT bytes, date, time FROM python_write_to_table_16189891591891.python_no_schema_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 244
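
The DEBUG lines above show google-auth fetching an access token from the GCE metadata server; the same request can be made by hand from a worker (a sketch; the metadata server is only reachable from inside GCE):

  # Ask the metadata server for an access token for the default service account
  curl -H "Metadata-Flavor: Google" \
    "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/token"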
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/f00b4cd7-1926-416e-a40a-3d7a49ea2c31?maxResults=0&timeoutMs=10000&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/f00b4cd7-1926-416e-a40a-3d7a49ea2c31?fields=jobReference%2CtotalRows%2CpageToken%2Crows&location=US&formatOptions.useInt64Timestamp=True&prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(b'xyw', datetime.date(2011, 1, 1), datetime.time(23, 59, 59, 999999)), (b'\xab\xac\xad', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'abc', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'\xe4\xbd\xa0\xe5\xa5\xbd', datetime.date(3000, 12, 31), datetime.time(23, 59, 59))]
INFO:apache_beam.io.gcp.bigquery_write_it_test:Deleting dataset python_write_to_table_16189891591891 in project apache-beam-testing
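
For reference, the matcher's verification (run before the dataset is deleted) amounts to issuing this query, for example with the bq CLI (a sketch; the query text is taken from the log above):

  bq query --project_id=apache-beam-testing --use_legacy_sql=false \
    'SELECT bytes, date, time FROM python_write_to_table_16189891591891.python_no_schema_table'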
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_02_33-9496142770253387657?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_16_05-12323175061580903054?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_23_49-16018957866940344487?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_31_26-1190314435856141117?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_40_13-12385802841403462619?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_50_01-7576690479062202287?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_57_34-6285454349658344073?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_00_05_06-13562212406973931425?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_02_32-9979577579507614850?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_25_41-18193948079436957756?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_34_32-9381427329528889910?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_42_53-9453536465657951170?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_51_32-8284033389782606317?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_59_46-16012286644607935625?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_00_08_13-14138282266779948271?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_02_32-11870323573130252035?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_14_06-17301306516284985263?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_21_55-14394184438758215818?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_30_09-4230705495085361540?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_38_29-11310731854890897259?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_46_26-13996460231215564072?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_54_46-2023227254392878198?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_00_02_35-1163992194790043911?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_02_29-10108043861286873558?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_23_07-17753184730724653581?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_31_43-10609826144332138666?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_40_27-13343587814187328034?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_48_46-4373657566612453394?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_56_55-14909408809595809828?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_00_05_24-12469040092658957278?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_00_13_42-7346528455194327423?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_04_19-16065319252190917660?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_13_14-5942967566789730429?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_22_00-15921003626908722003?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_33_20-13526659054366244623?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_42_56-6470195116738894614?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_52_07-8787795599632321363?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_00_01_11-7976190358712232259?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_00_09_56-14099751646743306920?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_02_30-14660847502164138978?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_11_52-8365108592536663872?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_23_15-3689849642433994497?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_34_48-8297866821674069458?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_44_05-3219588398006468040?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_00_00_50-1783797924953598818?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_02_32-8239351581302643912?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_11_57-11450302819287989392?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_21_01-13384784154286820000?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_28_53-5107424350005296668?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_36_43-12648077745471763044?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_44_41-18328471965456103978?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_53_33-4438678515534897688?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_00_03_25-2078709159338648636?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_00_12_52-8819735790751464427?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_02_31-15830685998063023093?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_12_03-8977352033113675964?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_21_05-12576918967080124371?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_30_33-6035654342414170872?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_39_53-1481750063126486350?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_49_37-5302598161819772882?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-20_23_57_44-595368417017891522?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-21_00_06_50-9069243810325534462?project=apache-beam-testing
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs a streaming Dataflow job and verifies that user metrics are reported ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
Test that schema update options are respected when appending to an existing table ... ok
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 69 tests in 4824.018s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 22m 15s
210 actionable tasks: 150 executed, 56 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/usqwhpcpdj5eq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



beam_PostCommit_Python37 - Build # 3713 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 3713 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/3713/ to view the results.