Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/06/04 09:56:33 UTC

Build failed in Jenkins: beam_PostCommit_Python3_Verify #1034

See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1034/display/redirect?page=changes>

Changes:

[kamil.wasilewski] [BEAM-7362] Added a performance test for BigQueryIO write

------------------------------------------
[...truncated 257.46 KB...]
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:interval_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "TeamScoresDict.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s9"
        },
        "serialized_fn": "eNp1kMlOw0AMhgNlnZalZd+XU7nkKSoEVCoHKpFLFU0SpxkpycSziBYpElzgAXhhnBSEekBzmJnf9ufffmt0Q17wMAE/AJ65MOFZkYJ2Q1ndBtwxz8BNpFXp1DeU4utQKmBDej5VL90ToWHo3LzjQomLXtNxHN9MC/ATkRuNjTl+Fah1NwKq5kYqze4fhyTfVTLDJQItD0pc6XotQklrCmtqoMbVQY0X+Z+0NrAfuB7oYISsxOYIW/P9FM91LFVWTUS2n0UeyReRjxluUKPNEre63joxjRLjMag4x+3/AD8prAcxt6kZ/nyxTaDObG6h/WgWxZ2B1yaJh6HNbMqNkLmfyQhwt7/gdaqOIgNtaN0+7ToQOSjco9AahV5ql+Rl/z8vswx2KyYQzWai1R2QkcMSj7reEkG0eAU8ngNYI1Ka47cx61lVG2N4QqWnJZ55K1SZiVBJjecPzudXQwe1JuNYg8GLpMq8LPEqOe87OrB0DF6733ivxpg=",
        "user_name": "TeamScoresDict"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s11",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "<lambda>"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:interval_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteTeamScoreSums/ConvertToRow.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s10"
        },
        "serialized_fn": "eNq9Vdt/1EQUnt2WW7gJ9cJFFG+4VZuAUC5aq7LlUhYWCJXGS42TZHYnbZLJmZnQLhIFcQsPvvHX+N95JrtLKQgPPvjbzWbn3L/znZm5P9YIaU5DzvyA0dTWkmaqI2Sq7FBIZjVpktAgYYuS5jmTc+JiZgGZfAC1EuoNb4wQ4ncyGAujOEls3/xafigZ1czvFFmoY4EO441N+kTQyNe9nFmwxduOIZoiYgu4hq192ObC9kar1iL41FsTzd2PCblHyMMa6dbILdjR7oM16dXQaw129mGXp/Cvw0XKnGWWrcSZGr2nVELvMGdVyBWFEJljEPo3hNJNkaax9m/0NBfZSf82k3Gn5ygZOipaUU5eyZ1n+uJs9MUxfbHzHuyuSp9JaBpEdBb2XNu1pUlgr1dHKbbktT7sm9Sw34WJTeC7TPtUa2nB61WAoIgTjdXCG942XKLaaOHNdXjLhQObXOM0F1L7qYiKBHt30DuMDq9gDw714bALb1d5fAwSat+HI+vwjgvv8j3tfyMtZLiAo3y8wZ/SMIY0HHqEqGokwm+drBPyuEaWCZl7aDh5rz2yrbfG0fZIXFG2NEHKGjH01chBfJ7UCWxF8/cnJ6sm2cfhg2qCQpHAh+vwkffX/0ElW6NpnjBDpHlr5nQpZuSikEnP1yawGnF8zLMMx6ZzxnoWPm793azvINDwxlHBEpbCZB8+QaI/deEzbwaFizLWbEGcj7s3CyZ7NlvLaRbZM4kIaaJm7dHEPCPZiD/1CGxvK0ZRWG9KwenDcY4ZTvBjfA/mJvB5lVmxpAMn+3DKKKc1nHbhDD/AD3qnnhuJEVp7hNY2aO0X0MLZPpxz4QuO4/GlCzM4Hvxo+F8iWc81wIKv8LyYLeHrBndK+Kbh7cQatdA0Gab+tpr8+fbChUsXXDhfATTxoFm14taCO9++BHPeLlysxlkkVn2lqdRwgc8VlW8uxTILNVz0Jp7inzJVT2mmcHd14VKVtTp1fB5nWsHlzQcfKiq5HTEsimohlTV/3ZxKl43YgnlEcaVdQqtRFSIKnRe6CqjgarsKH2cbomvtYh3awSBtoZj08SBlCVz39r44JHCjGjRtjlo/w7bCzSpJwmiEnoGgMgK3ghpRTRXTcMs7hqsXuu/jZhianJiePnf65Nkz06dgoQj68J0Lt/uwWILHL3OD5ntE80ODX23zqtgfA6XhJxeW+vCzC34ffimBNoa2AdqGG7bREBiVXZWz0NwBjC8VGjoudIcuHF3iDZflYERVyJSCFd4tgiVISkiXIHvlJbRYcY4sWiAwbl4C7r8dpl8y7naZxOTyZQGGJtYc69Ai0QvDJSgMpAcYYuVHAy0UbW+fGaAwLNIioeb2MsctgzutmrffZIxTHCjcCD7ugiDOmIRVVG1/OplYy9rLahlYWBfjNRYNMOFY9bCQuyX8OjhRVHyXwb1NAQodJ4hjlNiaKyQdXKsluv5Wwu/VLknjUAoF96+QR0/GVFDJRKdjRuUBN5Z/lPCQ328RFRT40fCn/Q90laaf",
        "user_name": "WriteTeamScoreSums/ConvertToRow"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s12",
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED",
        "dataset": "hourly_team_score_it_dataset1559638754",
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "RowAsDictJsonCoder$eNprYE5OLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBK5xfp",
              "component_encodings": []
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "bigquery",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s11"
        },
        "project": "apache-beam-testing",
        "schema": "{\"fields\": [{\"mode\": \"NULLABLE\", \"type\": \"INTEGER\", \"name\": \"total_score\"}, {\"mode\": \"NULLABLE\", \"type\": \"STRING\", \"name\": \"team\"}, {\"mode\": \"NULLABLE\", \"type\": \"STRING\", \"name\": \"window_start\"}]}",
        "table": "leader_board",
        "user_name": "WriteTeamScoreSums/WriteToBigQuery/WriteToBigQuery/NativeWrite",
        "write_disposition": "WRITE_APPEND"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
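
The ParallelWrite step s12 above encodes the BigQuery sink for WriteTeamScoreSums: dataset hourly_team_score_it_dataset1559638754, table leader_board, a three-field schema, CREATE_IF_NEEDED and WRITE_APPEND dispositions. A minimal sketch of a WriteToBigQuery call that would produce an equivalent step is shown below; the table, dataset, project and schema values are copied from the step properties, while the surrounding function and variable names are assumptions rather than the actual hourly_team_score source.

  # Hypothetical sketch of the transform behind step s12 ("WriteTeamScoreSums/WriteToBigQuery");
  # only the table/dataset/project/schema values come from the job graph above.
  import apache_beam as beam

  def write_team_scores(rows):
      # rows: a PCollection of dicts with keys 'team', 'total_score' and 'window_start'.
      return rows | 'WriteTeamScoreSums' >> beam.io.WriteToBigQuery(
          table='leader_board',
          dataset='hourly_team_score_it_dataset1559638754',
          project='apache-beam-testing',
          schema='total_score:INTEGER,team:STRING,window_start:STRING',
          create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
          write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
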
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/hourly_team_score_it_dataset1559638754?deleteContents=true HTTP/1.1" 204 0
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_01_59_55-10103858660637153495?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_16_49-12604982451908745963?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_25_58-2679988908635509801?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_33_58-2284532553134472598?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_00_18-16928064751176194382?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_09_38-1367281553083770379?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_20_54-14739087576747958780?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_29_33-11398546802334005662?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_38_06-14190949752266206971?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_01_59_54-4795754090923977698?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_15_00-4957354996670786337?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_24_04-12696566513604776911?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_01_59_50-12412408326956577270?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:218: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_21_09-6119384639724897046?project=apache-beam-testing.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_30_28-357116950493981306?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:229: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:229: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_01_59_49-9215951829431336855?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_09_52-14574295509941438298?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_18_25-4058304786010423610?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_27_38-10090278680899468163?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_01_59_47-11957002260587498729?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_08_27-11869218464912898098?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_19_30-4987418804116175615?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_29_03-435522211553690891?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_38_51-8563052512900453779?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_47_43-10440014234048175024?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_01_59_51-13085595283016277904?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_09_53-10221721421115333618?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_17_01-15853321549845315314?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_25_30-17373792882870139935?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_01_59_52-8869424282604613293?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_08_45-15088672988519262461?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_16_56-17403310890513991596?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_27_41-3504050446719526575?project=apache-beam-testing.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_02_37_51-9678706351617918866?project=apache-beam-testing.
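
The BeamDeprecationWarning lines interleaved with the job URLs above all point at the same migration: pipelines still writing through beam.io.BigQuerySink should move to beam.io.WriteToBigQuery. As a hedged illustration only (the table name and schema below are placeholders, not taken from the failing tests), the change usually looks like this:

  # Illustration of the migration suggested by
  # "BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead."
  import apache_beam as beam

  def write_counts(counts):
      # Deprecated style (emits BeamDeprecationWarning):
      #   return counts | beam.io.Write(beam.io.BigQuerySink(
      #       'my_dataset.my_table', schema='word:STRING,count:INTEGER'))
      # Current style:
      return counts | 'WriteToBQ' >> beam.io.WriteToBigQuery(
          'my_dataset.my_table',
          schema='word:STRING,count:INTEGER',
          write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
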

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 41 tests in 3440.649s

FAILED (SKIP=4, errors=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py35/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 59s
77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/wcpcnzkjsf74k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python3_Verify #1037

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1037/display/redirect>




Build failed in Jenkins: beam_PostCommit_Python3_Verify #1036

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1036/display/redirect?page=changes>

Changes:

[bhulette] Add support for full unboxing in Convert

------------------------------------------
[...truncated 231.97 KB...]
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_17_04-11566754571435273414?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_28_31-8960040604801879158?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_35_19-17570231095337163928?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_01_31-2490765474238896434?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_23_02-4553469888453626011?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_33_17-11701938318353997701?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_01_36-5199617966453457914?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_14_58-4624472651846236280?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_23_04-14190665912129820806?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_31_31-4418536811693379129?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:218: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:229: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:229: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_01_31-12018443840235002375?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_21_58-974782425278881163?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_30_10-8019078999727156605?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_38_25-12269079842370503359?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_01_34-8841969796957346456?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_10_55-8353482664867621753?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_19_42-16706997180344377464?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_29_15-6443554992546177622?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_01_30-13975250381680595016?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_13_18-6431557290535346877?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_24_04-1333516179818899458?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_32_37-16213622625665458401?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_41_07-17084233754807706223?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_01_38-11968092723833179102?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_11_51-182134606532389907?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_19_23-16149057120379772081?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_28_31-13163311482336046308?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_37_09-5349072943662005373?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_01_33-16287414181447727140?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_10_57-11115471862029680220?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_20_48-5727416268023359639?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_29_45-4652172381979026455?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_37_42-14577344498343665425?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_46_18-7440910568469327416?project=apache-beam-testing.
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 41 tests in 3203.670s

OK (SKIP=4)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_01_58-2052052994358818739?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_18_05-15029413492236399076?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_29_29-18315450896011972646?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_01_54-7950081941019643890?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_25_11-7536477271871609388?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:218: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_34_01-5503134820473731262?project=apache-beam-testing.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_42_24-3368176206643810117?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:229: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:229: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_01_59-10017110687014042277?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_15_59-13625680624107704619?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_24_06-7388183654441335173?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_33_10-5301436385792221170?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_01_55-5062615631562581825?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_19_59-14741995356199195948?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_28_22-12812045595389422046?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_37_42-14712509848451029686?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_01_55-1693275589847282835?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_11_10-13884286158759854119?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_19_28-6907643338435859534?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_27_54-8525253083441989529?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_36_20-11507724732228267219?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_01_53-5248482539906834444?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_11_10-4058844690628852571?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_20_31-12596891598810786870?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_30_08-12944500822394609815?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_40_51-5145924507024226047?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_49_40-16672098532310349926?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_01_58-14934021240251504360?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_12_15-2552908115759654279?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_22_21-902141807655277913?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_31_33-17734895138533782598?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_01_56-15707862359754610812?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_10_03-15608575294553049431?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_20_49-1641674966098296156?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_30_29-9755485326308751635?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_05_39_24-17134559011088283317?project=apache-beam-testing.
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 41 tests in 3465.374s

OK (SKIP=4)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py35/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 59m 47s
77 actionable tasks: 61 executed, 16 from cache

Publishing build scan...
https://gradle.com/s/hgh62ekoel7se

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #1035

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1035/display/redirect?page=changes>

Changes:

[mxm] [BEAM-2943] Check existence of to-be-staged files in PipelineResources

------------------------------------------
[...truncated 857.48 KB...]
 errorResult: <ErrorProto
 message: 'Not found: Table apache-beam-testing:python_bq_file_loads_15596438586234.beam_load_2019_06_04_102945_56_69be24971b4d49af0322d52dbe9f1867_0 was not found in location US'
 reason: 'notFound'>
 errors: [<ErrorProto
 message: 'Not found: Table apache-beam-testing:python_bq_file_loads_15596438586234.beam_load_2019_06_04_102945_56_69be24971b4d49af0322d52dbe9f1867_0 was not found in location US'
 reason: 'notFound'>]
 state: 'DONE'>)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 256, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 143, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 593, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 594, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 778, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 784, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 851, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 782, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 594, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 666, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 441, in process
    'BigQuery jobs failed. BQ error: %s', self._latest_error)
Exception: ("BigQuery jobs failed. BQ error: %s [while running 'WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs']", <JobStatus
 errorResult: <ErrorProto
 message: 'Not found: Table apache-beam-testing:python_bq_file_loads_15596438586234.beam_load_2019_06_04_102945_56_69be24971b4d49af0322d52dbe9f1867_0 was not found in location US'
 reason: 'notFound'>
 errors: [<ErrorProto
 message: 'Not found: Table apache-beam-testing:python_bq_file_loads_15596438586234.beam_load_2019_06_04_102945_56_69be24971b4d49af0322d52dbe9f1867_0 was not found in location US'
 reason: 'notFound'>]
 state: 'DONE'>)

root: INFO: 2019-06-04T10:30:34.470Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
root: INFO: 2019-06-04T10:30:34.570Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete
root: INFO: 2019-06-04T10:30:34.819Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 782, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 594, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 666, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 441, in process
    'BigQuery jobs failed. BQ error: %s', self._latest_error)
Exception: ('BigQuery jobs failed. BQ error: %s', <JobStatus
 errorResult: <ErrorProto
 message: 'Not found: Table apache-beam-testing:python_bq_file_loads_15596438586234.beam_load_2019_06_04_102945_56_69be24971b4d49af0322d52dbe9f1867_0 was not found in location US'
 reason: 'notFound'>
 errors: [<ErrorProto
 message: 'Not found: Table apache-beam-testing:python_bq_file_loads_15596438586234.beam_load_2019_06_04_102945_56_69be24971b4d49af0322d52dbe9f1867_0 was not found in location US'
 reason: 'notFound'>]
 state: 'DONE'>)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 256, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 143, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 593, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 594, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 778, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 784, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 851, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 782, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 594, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 666, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 441, in process
    'BigQuery jobs failed. BQ error: %s', self._latest_error)
Exception: ("BigQuery jobs failed. BQ error: %s [while running 'WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs']", <JobStatus
 errorResult: <ErrorProto
 message: 'Not found: Table apache-beam-testing:python_bq_file_loads_15596438586234.beam_load_2019_06_04_102945_56_69be24971b4d49af0322d52dbe9f1867_0 was not found in location US'
 reason: 'notFound'>
 errors: [<ErrorProto
 message: 'Not found: Table apache-beam-testing:python_bq_file_loads_15596438586234.beam_load_2019_06_04_102945_56_69be24971b4d49af0322d52dbe9f1867_0 was not found in location US'
 reason: 'notFound'>]
 state: 'DONE'>)

root: INFO: 2019-06-04T10:30:34.909Z: JOB_MESSAGE_DEBUG: Executing failure step failure91
root: INFO: 2019-06-04T10:30:34.962Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S47:WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
  beamapp-jenkins-060410241-06040324-2pcz-harness-c1pp,
  beamapp-jenkins-060410241-06040324-2pcz-harness-c1pp,
  beamapp-jenkins-060410241-06040324-2pcz-harness-c1pp,
  beamapp-jenkins-060410241-06040324-2pcz-harness-c1pp
root: INFO: 2019-06-04T10:30:35.190Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-06-04T10:30:35.267Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-06-04T10:30:35.318Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-06-04T10:32:46.869Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-06-04T10:32:46.920Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-06-04T10:32:47.010Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-06-04_03_24_45-10798962380975659970 is in state JOB_STATE_FAILED
root: INFO: Deleting dataset python_bq_file_loads_15596438586234 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
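
A note on the exception text in the traceback above: the '%s' placeholder is never
interpolated. The raise at bigquery_file_loads.py line 441 passes the format string
and the JobStatus object to Exception() as two separate arguments, so the repr that
reaches the log is the raw argument tuple (the second traceback only adds the step
name via _reraise_augmented). A minimal, hypothetical illustration of the difference,
not the Beam source:

    # Passing a format string and its argument as separate Exception args
    # leaves the placeholder unformatted; str(exc) is then the args tuple.
    status = "<JobStatus ... reason: 'notFound'>"

    try:
        raise Exception('BigQuery jobs failed. BQ error: %s', status)
    except Exception as exc:
        print(exc)   # ('BigQuery jobs failed. BQ error: %s', "<JobStatus ...>")

    try:
        raise Exception('BigQuery jobs failed. BQ error: %s' % status)
    except Exception as exc:
        print(exc)   # BigQuery jobs failed. BQ error: <JobStatus ... reason: 'notFound'>
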
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_13_56-8906450852702473485?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_30_15-17004204586219955033?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_33_32-12584301545777290721?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_42_07-4030076191844575628?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_13_55-9309236797326281381?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_37_43-12193550832075811381?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:218: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_49_10-14742986236429164727?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:229: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:229: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_13_52-4247526178043428062?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_28_05-1289398443032817553?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_37_25-6221883440336745958?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_45_50-18092429094416745344?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_13_58-429267993470055663?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_35_57-3921518711122602412?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_44_52-12473941512819863803?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_53_07-14340359340813793333?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_13_54-2507920920305301789?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_23_01-4900519962920582097?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_33_49-10040513739350906835?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_45_53-10380916294556850748?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_13_55-4071952394828065870?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_23_32-12488399620237939626?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_32_48-14473081222944629066?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_42_01-15493649023428531196?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_53_44-16409755765638744858?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_04_02_57-16765784777783710305?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_13_53-4949863236856658228?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_24_45-10798962380975659970?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_33_25-13455200751161776915?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_42_06-4163552481556941310?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_51_34-6701394296429395111?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_13_56-13712559933541622750?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_23_54-10827390334161002846?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_33_07-17521756882446332179?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_42_07-8683579225951617961?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-04_03_50_16-12100235212295644076?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 41 tests in 3509.590s

FAILED (SKIP=4, errors=2)
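
Several of the warnings captured above flag BigQuerySink as deprecated since 2.11.0
in favour of WriteToBigQuery. For reference, a minimal sketch of the replacement;
the project, dataset, table and schema values below are placeholders, not taken
from this build, and actually running it requires a real GCP project and credentials:

    # Sketch of writing with beam.io.WriteToBigQuery instead of the deprecated
    # BigQuerySink.
    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | 'Create' >> beam.Create([{'name': 'a', 'score': 1}])
         | 'WriteToBQ' >> beam.io.WriteToBigQuery(
             table='my_table',                      # placeholder table
             dataset='my_dataset',                  # placeholder dataset
             project='my-project',                  # placeholder project id
             schema='name:STRING,score:INTEGER',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
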

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build completed with 4 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py35/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 78

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 0m 17s
77 actionable tasks: 61 executed, 16 from cache

Publishing build scan...
https://gradle.com/s/2w45fqqycm2ta

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
