Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/06/14 15:19:45 UTC

Build failed in Jenkins: beam_PostCommit_Python3_Verify #1140

See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1140/display/redirect?page=changes>

Changes:

[kanterov] Merge pull request #8858: [BEAM-7542] Fix faulty cast in BigQueryUtils

------------------------------------------
[...truncated 309.82 KB...]
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "modify_data.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "serialized_fn": "ref_AppliedPTransform_modify_data_4",
        "user_name": "modify_data"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s3",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s2"
        },
        "pubsub_id_label": "id",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_outputff86783b-532b-4331-9ba1-200b45f8aee4",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
root: INFO: Create job: <Job
 createTime: '2019-06-14T14:48:33.645366Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-06-14_07_48_32-936337537614733380'
 location: 'us-central1'
 name: 'beamapp-jenkins-0614144808-635226'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-06-14T14:48:33.645366Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-06-14_07_48_32-936337537614733380]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_48_32-936337537614733380?project=apache-beam-testing
root: INFO: Job 2019-06-14_07_48_32-936337537614733380 is in state JOB_STATE_RUNNING
root: INFO: 2019-06-14T14:48:36.337Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-06-14T14:48:36.910Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-a.
root: INFO: 2019-06-14T14:48:37.359Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
root: INFO: 2019-06-14T14:48:37.367Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
root: INFO: 2019-06-14T14:48:37.396Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-06-14T14:48:37.405Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
root: INFO: 2019-06-14T14:48:37.418Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
root: INFO: 2019-06-14T14:48:37.424Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-06-14T14:48:37.475Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-06-14T14:48:37.482Z: JOB_MESSAGE_DETAILED: Fusing consumer modify_data into ReadFromPubSub/Read
root: INFO: 2019-06-14T14:48:37.488Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write/NativeWrite into modify_data
root: INFO: 2019-06-14T14:48:37.513Z: JOB_MESSAGE_BASIC: The pubsub read for: projects/apache-beam-testing/subscriptions/psit_subscription_inputff86783b-532b-4331-9ba1-200b45f8aee4 is configured to compute input data watermarks based on custom timestamp attribute timestamp. Cloud Dataflow has created an additional tracking subscription to do this, which will be cleaned up automatically. For details, see: https://cloud.google.com/dataflow/model/pubsub-io#timestamps-ids
root: INFO: 2019-06-14T14:48:37.526Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-06-14T14:48:37.595Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-06-14T14:48:37.646Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-06-14T14:48:37.910Z: JOB_MESSAGE_DEBUG: Executing wait step start2
root: INFO: 2019-06-14T14:48:37.970Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-06-14T14:48:38.017Z: JOB_MESSAGE_BASIC: Starting 1 workers...
root: INFO: 2019-06-14T14:48:39.405Z: JOB_MESSAGE_DETAILED: Pub/Sub resources set up for topic 'projects/apache-beam-testing/topics/psit_topic_inputff86783b-532b-4331-9ba1-200b45f8aee4'.
root: INFO: 2019-06-14T14:48:40.062Z: JOB_MESSAGE_BASIC: Executing operation ReadFromPubSub/Read+modify_data+WriteToPubSub/Write/NativeWrite
root: WARNING: Timing out on waiting for job 2019-06-14_07_48_32-936337537614733380 after 183 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
root: ERROR: Timeout after 300 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/psit_subscription_outputff86783b-532b-4331-9ba1-200b45f8aee4.
--------------------- >> end captured logging << ---------------------
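The job graph above describes the Pub/Sub pass-through pipeline exercised by the timed-out test: ReadFromPubSub/Read fused into modify_data, fused into WriteToPubSub/Write/NativeWrite, running as a streaming job. A minimal sketch of a pipeline with that shape, using placeholder topic/subscription names and an illustrative modify_data body (this is not the actual integration-test code; per the pubsub_id_label/pubsub_timestamp_label fields in the job graph, the real write step also uses id and timestamp attributes, omitted here to keep the sketch portable):

import apache_beam as beam
from apache_beam.io import ReadFromPubSub, WriteToPubSub  # requires apache-beam[gcp]
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

options = PipelineOptions()
options.view_as(StandardOptions).streaming = True  # streaming job, as in JOB_TYPE_STREAMING above

with beam.Pipeline(options=options) as p:
    _ = (p
         | 'ReadFromPubSub' >> ReadFromPubSub(
             subscription='projects/<project>/subscriptions/<input-subscription>')
         | 'modify_data' >> beam.Map(lambda data: data + b' - processed')  # illustrative body
         | 'WriteToPubSub' >> WriteToPubSub(
             topic='projects/<project>/topics/<output-topic>'))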
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_23_22-10784407606662638011?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_39_03-5318606287446233162?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_47_34-3249626119159821736?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_56_07-15578756738533891572?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_23_20-17796396428499640748?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_55_26-5953423451064282677?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_08_03_13-12385403374104062194?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_23_21-9798376194561672149?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_38_17-656989507490116101?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_50_20-2106441883032759387?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_59_10-17586169073790526945?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_08_07_21-8833643059629201525?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_23_22-10318162895451741904?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_46_35-17622389546376315227?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_55_27-1004385709448988982?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_08_03_17-2281828420798230923?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_23_21-2037392663774753502?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_31_13-18412359091223112728?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_44_21-6217457821619272299?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_54_24-17835733460757493269?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_23_21-6980955854554825690?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_38_56-7370299806280945719?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_48_32-936337537614733380?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_08_01_28-6820754695331028764?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_23_20-10542303735028644700?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_33_42-4247481280115924376?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_41_48-17664415606774583483?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_50_21-1843391780652762625?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_59_48-5494857999016988011?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_23_24-8741347209616171102?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_33_18-15326889697947981650?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_41_56-3401629003748581258?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_50_03-15635337894602207024?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_07_58_32-11365751849168818473?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_08_09_10-13697910159847191772?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3429.096s

FAILED (SKIP=5, failures=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build completed with 4 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py35/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 78

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 18s
77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/if5lrrkoelkly

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python3_Verify #1153

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1153/display/redirect>




Build failed in Jenkins: beam_PostCommit_Python3_Verify #1152

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1152/display/redirect>

------------------------------------------
[...truncated 242.32 KB...]
root: INFO: Job 2019-06-15_23_01_58-17595501665909227242 is in state JOB_STATE_RUNNING
root: INFO: 2019-06-16T06:02:01.942Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-06-16T06:02:02.631Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-a.
root: INFO: 2019-06-16T06:02:03.134Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
root: INFO: 2019-06-16T06:02:03.136Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
root: INFO: 2019-06-16T06:02:03.144Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-06-16T06:02:03.151Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
root: INFO: 2019-06-16T06:02:03.153Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
root: INFO: 2019-06-16T06:02:03.159Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-06-16T06:02:03.171Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-06-16T06:02:03.174Z: JOB_MESSAGE_DETAILED: Fusing consumer decode into ReadFromPubSub/Read
root: INFO: 2019-06-16T06:02:03.177Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
root: INFO: 2019-06-16T06:02:03.179Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/MergeBuckets
root: INFO: 2019-06-16T06:02:03.181Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write/NativeWrite into encode
root: INFO: 2019-06-16T06:02:03.183Z: JOB_MESSAGE_DETAILED: Fusing consumer encode into format
root: INFO: 2019-06-16T06:02:03.185Z: JOB_MESSAGE_DETAILED: Fusing consumer group/MergeBuckets into group/ReadStream
root: INFO: 2019-06-16T06:02:03.188Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
root: INFO: 2019-06-16T06:02:03.190Z: JOB_MESSAGE_DETAILED: Fusing consumer group/WriteStream into WindowInto(WindowIntoFn)
root: INFO: 2019-06-16T06:02:03.193Z: JOB_MESSAGE_DETAILED: Fusing consumer split into decode
root: INFO: 2019-06-16T06:02:03.195Z: JOB_MESSAGE_DETAILED: Fusing consumer WindowInto(WindowIntoFn) into pair_with_one
root: INFO: 2019-06-16T06:02:03.206Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-06-16T06:02:03.231Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-06-16T06:02:03.281Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-06-16T06:02:03.440Z: JOB_MESSAGE_DEBUG: Executing wait step start2
root: INFO: 2019-06-16T06:02:03.455Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-06-16T06:02:03.462Z: JOB_MESSAGE_BASIC: Starting 1 workers...
root: INFO: 2019-06-16T06:02:05.608Z: JOB_MESSAGE_BASIC: Executing operation group/ReadStream+group/MergeBuckets+count+format+encode+WriteToPubSub/Write/NativeWrite
root: INFO: 2019-06-16T06:02:05.608Z: JOB_MESSAGE_BASIC: Executing operation ReadFromPubSub/Read+decode+split+pair_with_one+WindowInto(WindowIntoFn)+group/WriteStream
root: WARNING: Timing out on waiting for job 2019-06-15_23_01_58-17595501665909227242 after 184 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
root: ERROR: Timeout after 400 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/wc_subscription_output7b2f3712-c05a-408a-86e6-a3edb5b1267a.
--------------------- >> end captured logging << ---------------------
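The fusing log above lays out the streaming word-count shape under test: ReadFromPubSub -> decode -> split -> pair_with_one -> WindowInto(WindowIntoFn) -> group -> count -> format -> encode -> WriteToPubSub. A rough sketch of a streaming word count with those step names, using placeholder subscription/topic names and an assumed 15-second fixed window (not the actual streaming word-count test code):

import apache_beam as beam
from apache_beam.io import ReadFromPubSub, WriteToPubSub  # requires apache-beam[gcp]
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions
from apache_beam.transforms import window

options = PipelineOptions()
options.view_as(StandardOptions).streaming = True

with beam.Pipeline(options=options) as p:
    _ = (p
         | 'ReadFromPubSub' >> ReadFromPubSub(
             subscription='projects/<project>/subscriptions/<input-subscription>')
         | 'decode' >> beam.Map(lambda data: data.decode('utf-8'))
         | 'split' >> beam.FlatMap(lambda line: line.split())
         | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
         | 'WindowInto' >> beam.WindowInto(window.FixedWindows(15))  # window size is an assumption
         | 'group' >> beam.GroupByKey()
         | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
         | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
         | 'encode' >> beam.Map(lambda text: text.encode('utf-8'))
         | 'WriteToPubSub' >> WriteToPubSub(
             topic='projects/<project>/topics/<output-topic>'))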
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_01_55-11222335737574838846?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_18_12-9358376266661247549?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_26_48-9762237237511266104?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_35_59-10693106337510697732?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_02_08-467390514616368951?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_23_39-1735401428029791617?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_33_25-6672983393762761991?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_01_57-13181201793478992258?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_15_30-971612860496005018?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_23_11-2642098983488463871?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_32_44-1655964150552541044?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_01_53-7378369061544349207?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_20_24-4680483179527998697?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_29_45-263541507387045814?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_36_58-2633770647122033785?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_01_50-4054777160979023776?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_12_01-11507925010590355179?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_19_13-4606050409720018482?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_26_19-7032609580676947239?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_35_21-3484223945797114082?project=apache-beam-testing.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_01_55-5317603642647074818?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_10_57-9110495963834552751?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_19_04-12406482737855514543?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_28_25-16012527490324905015?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_01_58-17595501665909227242?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_15_37-9604859984884398192?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_24_18-15809822738729793769?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_33_49-856709324285470682?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_40_39-3820087249855539246?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_01_51-672389857419094072?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_11_13-1666582187567863047?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_20_24-4223356044101187995?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_29_05-412445491157718041?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_36_35-15212052843699796312?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_45_36-6167286209870675250?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3156.557s

FAILED (SKIP=5, failures=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_01_33-15804477625257020331?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_17_49-4451383018217893167?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_26_05-13778020703593643982?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_34_21-17877181158604784976?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_42_26-6368000641728533131?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_01_30-7234297284730210373?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_31_08-17934444603927436936?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_01_30-10634580952335074590?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_14_36-11872775755745544649?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_23_24-962694901907915662?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_32_16-8343205662893310704?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_01_28-11923819558135266058?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_21_39-8381401348164427522?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_28_09-2357700098529562182?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_35_46-9787440350687190348?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_01_28-3163650032920876229?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_11_37-10584399947297368378?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_18_54-5705034664634517781?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_26_05-8483818155854997226?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_35_06-11694458662768349497?project=apache-beam-testing.
  kms_key=kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_01_27-3898770655045405691?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_10_24-1703993980677893938?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_19_50-920529487808337052?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_29_34-12805438374300673?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_38_27-5728886527164039198?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_01_34-4364991610103072158?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_11_51-12056158853914677094?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_23_21-5063515052420237307?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_31_09-12434549344181535963?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_01_29-9397379700875982636?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_11_15-11276053307735992121?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_20_01-8754437902635941644?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_27_27-15977735645783978893?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_41_57-13593929174096062625?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_23_51_58-18366408101131500783?project=apache-beam-testing.
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform_streaming (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... SKIP: TestStream is not supported on TestDataflowRunner
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3544.655s

OK (SKIP=5)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 0m 5s
77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/d7s7nufmwwu4i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #1151

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1151/display/redirect>

------------------------------------------
[...truncated 208.63 KB...]
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_01_36-11003257281227551279?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_15_02-619831115219226691?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_21_53-13785308079917059912?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_32_34-9927401727560739369?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_40_05-310334878643104879?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_01_35-2482173000364701031?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_23_03-16818354967816925297?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_33_19-3709024116842275825?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_01_34-765646337545743437?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_10_03-13174701693105895908?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_20_59-16959574798632936238?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_29_27-1485306226557247493?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_01_33-658125000990644181?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_09_50-13891527563416596539?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_18_16-17672153988001876024?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_26_36-14410735922367518756?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_43_23-17174172517186244496?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_52_24-5796935739158336100?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_01_37-16382958596311880866?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_09_57-1734223429918331719?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_18_48-2230126411140536359?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_26_49-11877376695947280859?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_36_55-15242163609413468579?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_01_36-17314885360564115988?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_09_49-8056699631047614707?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_18_41-4189905883999904980?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_27_26-9675556810510837834?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform_streaming (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... SKIP: TestStream is not supported on TestDataflowRunner
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3584.672s

OK (SKIP=5)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_02_00-15292409834258109789?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_17_49-1198146100243266446?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_26_15-17255519063479291614?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_34_36-14750953257655983768?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_01_57-4281112334492454441?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_21_35-953405274704770022?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_29_26-3769983808608417123?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_37_28-14207071666627332322?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_01_59-17459062010990059402?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_15_40-4926427010854799100?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_23_54-5862431596106342431?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_34_35-13391662824670882419?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_43_06-6999608219733281388?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
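
The FutureWarnings above flag fileio.MatchAll and fileio.ReadMatches as experimental; the 'GetPath' and 'Checksums' snippets come from fileio_test.py. A small, self-contained sketch of the same transforms is below; the glob is a placeholder, and the test's compute_hash step is replaced with a plain read for illustration:

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        matches = p | 'MatchFiles' >> fileio.MatchFiles('/tmp/data/*.txt')  # placeholder glob

        paths = matches | 'GetPath' >> beam.Map(lambda metadata: metadata.path)

        contents = (matches
                    | 'ReadMatches' >> fileio.ReadMatches()
                    | 'ReadText' >> beam.Map(lambda readable_file: readable_file.read_utf8()))
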
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_01_57-138800069953676218?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_23_55-692921140262356260?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_32_18-514535538811537751?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_01_56-3168463610070770294?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_10_22-12941003795314040085?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_19_33-7130874555470832909?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_28_50-4991642079840605699?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_43_46-6955225078970208844?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_53_19-14807803473859807385?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_01_55-4787551921812938282?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_10_02-12767937760192640140?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_19_49-9720287850421426167?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_29_04-1976150562531461468?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_01_58-16830398326550049546?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_10_36-16504327462823066246?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_18_58-685233563059509188?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_27_05-2470974185286623708?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_36_32-10739831315283980719?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_01_58-2603543548767178778?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_10_57-2792095865928955419?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_21_49-7225319261615842543?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_17_29_47-5707712709611646647?project=apache-beam-testing.
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform_streaming (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... SKIP: TestStream is not supported on TestDataflowRunner
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3680.394s

OK (SKIP=5)

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py35/build.gradle'> line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py36/build.gradle'> line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 2m 20s
77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/5a2bzbcaq6ais

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #1150

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1150/display/redirect>

------------------------------------------
[...truncated 297.49 KB...]
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteToPubSub/ToProtobuf.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s3"
        },
        "serialized_fn": "ref_AppliedPTransform_WriteToPubSub/ToProtobuf_7",
        "user_name": "WriteToPubSub/ToProtobuf"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s5",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "pubsub_id_label": "id",
        "pubsub_serialized_attributes_fn": "",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_output551794ac-fd3e-49c7-bab3-8d675ccab236",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
root: INFO: Create job: <Job
 createTime: '2019-06-15T18:30:28.096266Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-06-15_11_30_26-13324837483121356009'
 location: 'us-central1'
 name: 'beamapp-jenkins-0615183003-559017'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-06-15T18:30:28.096266Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-06-15_11_30_26-13324837483121356009]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_30_26-13324837483121356009?project=apache-beam-testing
root: INFO: Job 2019-06-15_11_30_26-13324837483121356009 is in state JOB_STATE_RUNNING
root: INFO: 2019-06-15T18:30:30.740Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-06-15T18:30:31.300Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-a.
root: INFO: 2019-06-15T18:30:31.821Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
root: INFO: 2019-06-15T18:30:31.830Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
root: INFO: 2019-06-15T18:30:31.846Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-06-15T18:30:31.858Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
root: INFO: 2019-06-15T18:30:31.862Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
root: INFO: 2019-06-15T18:30:31.869Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-06-15T18:30:31.893Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-06-15T18:30:31.902Z: JOB_MESSAGE_DETAILED: Fusing consumer ReadFromPubSub/Map(_from_proto_str) into ReadFromPubSub/Read
root: INFO: 2019-06-15T18:30:31.905Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/ToProtobuf into add_attribute
root: INFO: 2019-06-15T18:30:31.908Z: JOB_MESSAGE_DETAILED: Fusing consumer add_attribute into ReadFromPubSub/Map(_from_proto_str)
root: INFO: 2019-06-15T18:30:31.912Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write/NativeWrite into WriteToPubSub/ToProtobuf
root: INFO: 2019-06-15T18:30:31.923Z: JOB_MESSAGE_BASIC: The pubsub read for: projects/apache-beam-testing/subscriptions/psit_subscription_input551794ac-fd3e-49c7-bab3-8d675ccab236 is configured to compute input data watermarks based on custom timestamp attribute timestamp. Cloud Dataflow has created an additional tracking subscription to do this, which will be cleaned up automatically. For details, see: https://cloud.google.com/dataflow/model/pubsub-io#timestamps-ids
root: INFO: 2019-06-15T18:30:31.930Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-06-15T18:30:31.994Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-06-15T18:30:32.044Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-06-15T18:30:32.270Z: JOB_MESSAGE_DEBUG: Executing wait step start2
root: INFO: 2019-06-15T18:30:32.299Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-06-15T18:30:32.309Z: JOB_MESSAGE_BASIC: Starting 1 workers...
root: INFO: 2019-06-15T18:30:33.404Z: JOB_MESSAGE_DETAILED: Pub/Sub resources set up for topic 'projects/apache-beam-testing/topics/psit_topic_input551794ac-fd3e-49c7-bab3-8d675ccab236'.
root: INFO: 2019-06-15T18:30:34.297Z: JOB_MESSAGE_BASIC: Executing operation ReadFromPubSub/Read+ReadFromPubSub/Map(_from_proto_str)+add_attribute+WriteToPubSub/ToProtobuf+WriteToPubSub/Write/NativeWrite
root: WARNING: Timing out on waiting for job 2019-06-15_11_30_26-13324837483121356009 after 182 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
root: ERROR: Timeout after 300 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/psit_subscription_output551794ac-fd3e-49c7-bab3-8d675ccab236.
--------------------- >> end captured logging << ---------------------
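
For orientation, the captured graph and fusing messages above (ReadFromPubSub/Map(_from_proto_str), add_attribute, WriteToPubSub/ToProtobuf, Write/NativeWrite, with pubsub_id_label "id" and pubsub_timestamp_label "timestamp") correspond to a Pub/Sub round-trip pipeline roughly like the sketch below. This is not the actual test code; the project, subscription, and topic names are placeholders, and the attribute added in 'add_attribute' is invented for illustration:

    import apache_beam as beam
    from apache_beam.io.gcp.pubsub import PubsubMessage
    from apache_beam.options.pipeline_options import PipelineOptions

    INPUT_SUBSCRIPTION = 'projects/<project>/subscriptions/<input-subscription>'  # placeholder
    OUTPUT_TOPIC = 'projects/<project>/topics/<output-topic>'                     # placeholder

    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as p:
        (p
         | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
               subscription=INPUT_SUBSCRIPTION, with_attributes=True,
               id_label='id', timestamp_attribute='timestamp')
         | 'add_attribute' >> beam.Map(
               lambda msg: PubsubMessage(msg.data, dict(msg.attributes, processed='true')))
         | 'WriteToPubSub' >> beam.io.WriteToPubSub(
               OUTPUT_TOPIC, with_attributes=True,
               id_label='id', timestamp_attribute='timestamp'))

The failure itself is the verifier timing out: "Received 0 messages" after 300 seconds means no output reached the result subscription before the deadline expired.
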
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_01_59-3585840175868358482?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_16_58-5893174032935622417?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_25_50-10448137890377150255?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_33_02-12202435032805055075?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_02_01-7717290707993362621?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_21_47-15641435745701608262?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_31_14-10157852871481345085?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_38_17-16565814868020260138?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_01_51-15047125697658115938?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_15_11-5836926076053421304?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_24_22-3470060917230084047?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_02_13-12610208055534508369?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_22_20-11674441573321232077?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_30_26-13324837483121356009?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_01_57-9713269437361854501?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_11_28-11760721065541147982?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_20_15-9960323299751395376?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_27_51-10341745131759302600?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_41_47-7325200308570591219?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_52_28-7677640766496277178?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_02_03-17302024627421663931?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_10_09-1956564109032689808?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_19_42-10140575436471042637?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_29_52-8023183397998345017?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_37_49-443606952123044403?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_01_52-18155110609494625700?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_11_44-8442424840451910365?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_19_07-5751286609391255608?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_25_53-9842020963172215559?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_32_56-10723973068390986699?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_02_06-7917549177619713377?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_13_39-6131620818783056214?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_24_37-4001393778313360983?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_35_40-4891364719938834184?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_11_43_40-6788278717363088987?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3694.568s

FAILED (SKIP=5, failures=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build completed with 4 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py35/build.gradle'> line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle'> line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle'> line: 78

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle'> line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 2m 37s
77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/tzsfpntshgf6u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #1149

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1149/display/redirect>

------------------------------------------
[...truncated 870.12 KB...]
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:interval_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "encode.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s8"
        },
        "serialized_fn": "ref_AppliedPTransform_encode_11",
        "user_name": "encode"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s10",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s9"
        },
        "pubsub_topic": "projects/apache-beam-testing/topics/wc_topic_outputec43efe0-ec74-412c-9a9d-e2497e200d9d",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
root: INFO: Create job: <Job
 createTime: '2019-06-15T12:01:37.332605Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-06-15_05_01_35-3853336893918136170'
 location: 'us-central1'
 name: 'beamapp-jenkins-0615120122-177665'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-06-15T12:01:37.332605Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-06-15_05_01_35-3853336893918136170]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_01_35-3853336893918136170?project=apache-beam-testing
root: INFO: Job 2019-06-15_05_01_35-3853336893918136170 is in state JOB_STATE_RUNNING
root: INFO: 2019-06-15T12:01:39.946Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-06-15T12:01:40.495Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-a.
root: INFO: 2019-06-15T12:01:40.975Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
root: INFO: 2019-06-15T12:01:40.977Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
root: INFO: 2019-06-15T12:01:40.984Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-06-15T12:01:40.990Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
root: INFO: 2019-06-15T12:01:40.992Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
root: INFO: 2019-06-15T12:01:40.996Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-06-15T12:01:41.006Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-06-15T12:01:41.008Z: JOB_MESSAGE_DETAILED: Fusing consumer decode into ReadFromPubSub/Read
root: INFO: 2019-06-15T12:01:41.010Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
root: INFO: 2019-06-15T12:01:41.012Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/MergeBuckets
root: INFO: 2019-06-15T12:01:41.014Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write/NativeWrite into encode
root: INFO: 2019-06-15T12:01:41.016Z: JOB_MESSAGE_DETAILED: Fusing consumer encode into format
root: INFO: 2019-06-15T12:01:41.018Z: JOB_MESSAGE_DETAILED: Fusing consumer group/MergeBuckets into group/ReadStream
root: INFO: 2019-06-15T12:01:41.021Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
root: INFO: 2019-06-15T12:01:41.022Z: JOB_MESSAGE_DETAILED: Fusing consumer group/WriteStream into WindowInto(WindowIntoFn)
root: INFO: 2019-06-15T12:01:41.024Z: JOB_MESSAGE_DETAILED: Fusing consumer split into decode
root: INFO: 2019-06-15T12:01:41.027Z: JOB_MESSAGE_DETAILED: Fusing consumer WindowInto(WindowIntoFn) into pair_with_one
root: INFO: 2019-06-15T12:01:41.037Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-06-15T12:01:41.074Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-06-15T12:01:41.083Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-06-15T12:01:41.308Z: JOB_MESSAGE_DEBUG: Executing wait step start2
root: INFO: 2019-06-15T12:01:41.320Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-06-15T12:01:41.324Z: JOB_MESSAGE_BASIC: Starting 1 workers...
root: INFO: 2019-06-15T12:01:43.462Z: JOB_MESSAGE_BASIC: Executing operation group/ReadStream+group/MergeBuckets+count+format+encode+WriteToPubSub/Write/NativeWrite
root: INFO: 2019-06-15T12:01:43.463Z: JOB_MESSAGE_BASIC: Executing operation ReadFromPubSub/Read+decode+split+pair_with_one+WindowInto(WindowIntoFn)+group/WriteStream
root: WARNING: Timing out on waiting for job 2019-06-15_05_01_35-3853336893918136170 after 183 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
root: ERROR: Timeout after 400 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/wc_subscription_outputec43efe0-ec74-412c-9a9d-e2497e200d9d.
--------------------- >> end captured logging << ---------------------
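
The fusing messages above (decode, split, pair_with_one, WindowInto(WindowIntoFn), group, count, format, encode, WriteToPubSub) describe the streaming wordcount pipeline under test. A rough, illustrative sketch of such a pipeline follows; the subscription/topic names and the 15-second window size are assumptions, not values from this build:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.transforms import window

    INPUT_SUBSCRIPTION = 'projects/<project>/subscriptions/<input-subscription>'  # placeholder
    OUTPUT_TOPIC = 'projects/<project>/topics/<output-topic>'                     # placeholder

    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as p:
        (p
         | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(subscription=INPUT_SUBSCRIPTION)
         | 'decode' >> beam.Map(lambda data: data.decode('utf-8'))
         | 'split' >> beam.FlatMap(lambda line: line.split())
         | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
         | 'WindowInto' >> beam.WindowInto(window.FixedWindows(15))  # assumed window size
         | 'group' >> beam.GroupByKey()
         | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
         | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
         | 'encode' >> beam.Map(lambda text: text.encode('utf-8'))
         | 'WriteToPubSub' >> beam.io.WriteToPubSub(OUTPUT_TOPIC))

As in the previous failure, the error is the verifier's timeout: no messages arrived on the output subscription within 400 seconds.
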
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_01_36-919610370948394907?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_16_25-11551679750919393982?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_23_52-11226911958423401782?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_31_53-15679772073250652163?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_40_43-12171136622613410005?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_01_32-1170609915035496877?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_22_25-5159735722822238247?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_32_27-16665789805668274477?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_01_34-14286574595470527304?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_15_26-18001679411010116914?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_23_21-15589174579919954239?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_32_29-12864123847569366346?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_01_32-13045508929529956216?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_21_34-17240316954179861146?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_29_15-6270699311293086088?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_40_57-16526104694673424474?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_01_33-2001134375581725340?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_09_45-3095071649897744315?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_17_51-2598578766717620483?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_24_52-917142846609977358?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_33_00-2226826670689758769?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_01_30-3893849941758787963?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_08_58-11782224991638827677?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_18_24-145211137259717385?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_26_08-10123594685099346484?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_01_35-3853336893918136170?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_14_27-6808001378441474558?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_15_42-12329291788525045137?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_24_24-8618557057279137631?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_01_34-1727363321761398499?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_09_36-17472442497502303123?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_18_02-3265489515002133273?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_26_02-4595089912261283182?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_35_23-1212254566886826749?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_05_44_28-4840349918163578059?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3068.882s

FAILED (SKIP=5, errors=1, failures=1)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build completed with 4 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py36/build.gradle'> line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 78

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 52m 16s
77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/zb4nbfke7egzm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #1148

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1148/display/redirect?page=changes>

Changes:

[mxm] [BEAM-7144] Fix for rescaling problem on Flink >= 1.6

------------------------------------------
[...truncated 245.95 KB...]
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-06-15_03_12_08-17158733178057517146]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_12_08-17158733178057517146?project=apache-beam-testing
root: INFO: Job 2019-06-15_03_12_08-17158733178057517146 is in state JOB_STATE_RUNNING
root: INFO: 2019-06-15T10:12:12.842Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-06-15T10:12:13.410Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-a.
root: INFO: 2019-06-15T10:12:13.893Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
root: INFO: 2019-06-15T10:12:13.895Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
root: INFO: 2019-06-15T10:12:13.901Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-06-15T10:12:13.907Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
root: INFO: 2019-06-15T10:12:13.909Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
root: INFO: 2019-06-15T10:12:13.914Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-06-15T10:12:13.924Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-06-15T10:12:13.926Z: JOB_MESSAGE_DETAILED: Fusing consumer decode into ReadFromPubSub/Read
root: INFO: 2019-06-15T10:12:13.928Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
root: INFO: 2019-06-15T10:12:13.930Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/MergeBuckets
root: INFO: 2019-06-15T10:12:13.932Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write/NativeWrite into encode
root: INFO: 2019-06-15T10:12:13.934Z: JOB_MESSAGE_DETAILED: Fusing consumer encode into format
root: INFO: 2019-06-15T10:12:13.936Z: JOB_MESSAGE_DETAILED: Fusing consumer group/MergeBuckets into group/ReadStream
root: INFO: 2019-06-15T10:12:13.938Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
root: INFO: 2019-06-15T10:12:13.940Z: JOB_MESSAGE_DETAILED: Fusing consumer group/WriteStream into WindowInto(WindowIntoFn)
root: INFO: 2019-06-15T10:12:13.942Z: JOB_MESSAGE_DETAILED: Fusing consumer split into decode
root: INFO: 2019-06-15T10:12:13.944Z: JOB_MESSAGE_DETAILED: Fusing consumer WindowInto(WindowIntoFn) into pair_with_one
root: INFO: 2019-06-15T10:12:13.953Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-06-15T10:12:13.976Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-06-15T10:12:13.987Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-06-15T10:12:14.076Z: JOB_MESSAGE_DEBUG: Executing wait step start2
root: INFO: 2019-06-15T10:12:14.086Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-06-15T10:12:14.090Z: JOB_MESSAGE_BASIC: Starting 1 workers...
root: INFO: 2019-06-15T10:12:16.246Z: JOB_MESSAGE_BASIC: Executing operation group/ReadStream+group/MergeBuckets+count+format+encode+WriteToPubSub/Write/NativeWrite
root: INFO: 2019-06-15T10:12:16.279Z: JOB_MESSAGE_BASIC: Executing operation ReadFromPubSub/Read+decode+split+pair_with_one+WindowInto(WindowIntoFn)+group/WriteStream
root: WARNING: Timing out on waiting for job 2019-06-15_03_12_08-17158733178057517146 after 182 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
root: ERROR: Timeout after 400 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/wc_subscription_output64cf5c8f-0210-487a-9226-014e1aca9427.
--------------------- >> end captured logging << ---------------------
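
The timeout above reports zero messages pulled from the test's output subscription. A hedged sketch of checking such a subscription by hand with the google-cloud-pubsub client (assuming the 2.x client API; the subscription name below is a placeholder, not the generated wc_subscription_output* resource):

    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()
    # Placeholder subscription; the test generates a unique wc_subscription_output* name.
    subscription = subscriber.subscription_path('apache-beam-testing', 'example-output-subscription')

    # Pull a handful of messages without acknowledging them, just to see whether
    # the pipeline produced any output at all (google-cloud-pubsub >= 2.0 signature).
    response = subscriber.pull(request={'subscription': subscription, 'max_messages': 10}, timeout=30)
    for received in response.received_messages:
        print(received.message.data)
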
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_12_06-7158682277246063700?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_27_42-62410604106786555?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_35_29-10944742701079455176?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_42_50-4139286909139999350?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_12_03-17085368753474358422?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_32_56-3048615104822693849?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_40_57-15408331119679318667?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_12_06-18188287173465638028?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_24_52-9576439881001917912?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_35_33-12587812569523088365?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_12_03-15969730561214985405?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_32_19-6537846638845912997?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_40_10-240266598700561593?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_51_13-12381671246394710385?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_12_06-1397314806232225450?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_20_46-5672183240856926724?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_29_01-6834502944434565356?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_37_01-2838155174668907412?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_46_46-941716795345594661?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_55_52-8527729002634452456?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_12_02-14951810166650449287?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_21_03-7674960570578360262?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_29_05-16068881652175521066?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_36_36-1452714399645992271?project=apache-beam-testing.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_45_48-16897359757841464854?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_12_08-17158733178057517146?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_26_29-15252852291191244681?project=apache-beam-testing.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_34_05-2392288704385356510?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_44_26-17163985931819275894?project=apache-beam-testing.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_12_05-11826162482219724014?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_20_32-11713983093917097328?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_29_23-2170399262065735985?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_36_59-922141197512843359?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_44_21-3531748665987828811?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_52_31-17159685994870910888?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3133.432s

FAILED (SKIP=5, failures=1)
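
Several runs above also emit "BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead." A minimal sketch of the replacement transform (the project, dataset, table and schema below are placeholders, not the tables these integration tests write to):

    import apache_beam as beam

    rows = [{'word': 'beam', 'count': 1}]  # hypothetical input rows

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create(rows)
             | beam.io.WriteToBigQuery(
                 table='example-project:example_dataset.word_counts',  # placeholder table spec
                 schema='word:STRING,count:INTEGER',
                 create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))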

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_12_11-15944884169527128401?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_27_21-16963941787033013432?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_36_33-5703273398888024804?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_45_48-11942140673930952660?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_58_58-13532336525458875991?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_12_03-9246153602129899749?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_32_03-8430469476392689671?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_40_12-272375775813138847?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_12_04-16086157854911648428?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_24_52-6900303815939643740?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_40_15-1104300794865529244?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_12_03-12426837457830563893?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_30_19-1046792158959000087?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_38_46-13392684035330336053?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_46_38-18162448643082926233?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_12_04-7366319475633394792?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_26_34-17566742487432656926?project=apache-beam-testing.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_34_24-7572372546980498578?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_44_00-4576202477870288717?project=apache-beam-testing.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_12_01-1944634430376827836?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_19_33-14395633403276002257?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_29_20-15364621394577940943?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_38_39-16573580884552339739?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_48_37-11543790251710561972?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_12_08-949095525063210185?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_22_18-2288942321825303664?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_30_11-5605308341879051229?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_36_57-10690936414350624620?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_45_49-6241411832744061912?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_12_04-16437072209983287816?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_20_35-2640662825007334285?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_29_02-7917027946737053843?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_36_17-9242956484681873401?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_49_12-11897193313184768442?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-15_03_57_13-3496628682447819635?project=apache-beam-testing.
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform_streaming (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... SKIP: TestStream is not supported on TestDataflowRunner
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3509.943s

OK (SKIP=5)
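
The FutureWarning lines above come from the fileio integration test (test_transform_on_gcs) exercising the experimental MatchAll and ReadMatches transforms. A small sketch of the same pattern (the file glob and hash function are placeholders, not the GCS paths used by the test):

    import hashlib

    import apache_beam as beam
    from apache_beam.io import fileio

    def compute_hash(content):
        # content is the raw bytes of one matched file.
        return hashlib.sha1(content).hexdigest()

    with beam.Pipeline() as p:
        matched = (p
                   | beam.Create(['/tmp/example-*.txt'])  # placeholder glob
                   | fileio.MatchAll())                    # experimental, as the warning notes
        _ = matched | 'GetPath' >> beam.Map(lambda metadata: metadata.path)
        _ = (matched
             | fileio.ReadMatches()                        # experimental, as the warning notes
             | beam.Map(lambda readable_file: readable_file.read())
             | 'Checksums' >> beam.Map(compute_hash))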

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 59m 29s
77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/ldwfymen4dt2w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #1147

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1147/display/redirect>

------------------------------------------
[...truncated 435.76 KB...]
        },
        "serialized_fn": "ref_AppliedPTransform_format_10",
        "user_name": "format"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s9",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "<lambda>"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:interval_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "encode.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s8"
        },
        "serialized_fn": "ref_AppliedPTransform_encode_11",
        "user_name": "encode"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s10",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s9"
        },
        "pubsub_topic": "projects/apache-beam-testing/topics/wc_topic_output3353f170-f2ef-4a37-9694-7fb0022ba641",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
root: INFO: Create job: <Job
 createTime: '2019-06-15T06:01:30.989019Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-06-14_23_01_29-13345869475325492462'
 location: 'us-central1'
 name: 'beamapp-jenkins-0615060114-693521'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-06-15T06:01:30.989019Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-06-14_23_01_29-13345869475325492462]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_01_29-13345869475325492462?project=apache-beam-testing
root: INFO: Job 2019-06-14_23_01_29-13345869475325492462 is in state JOB_STATE_RUNNING
root: INFO: 2019-06-15T06:01:33.560Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-06-15T06:01:34.403Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-a.
root: INFO: 2019-06-15T06:01:34.941Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
root: INFO: 2019-06-15T06:01:34.943Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
root: INFO: 2019-06-15T06:01:34.949Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-06-15T06:01:34.955Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
root: INFO: 2019-06-15T06:01:34.957Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
root: INFO: 2019-06-15T06:01:34.962Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-06-15T06:01:34.973Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-06-15T06:01:34.975Z: JOB_MESSAGE_DETAILED: Fusing consumer decode into ReadFromPubSub/Read
root: INFO: 2019-06-15T06:01:34.977Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
root: INFO: 2019-06-15T06:01:34.979Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/MergeBuckets
root: INFO: 2019-06-15T06:01:34.981Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write/NativeWrite into encode
root: INFO: 2019-06-15T06:01:34.983Z: JOB_MESSAGE_DETAILED: Fusing consumer encode into format
root: INFO: 2019-06-15T06:01:34.984Z: JOB_MESSAGE_DETAILED: Fusing consumer group/MergeBuckets into group/ReadStream
root: INFO: 2019-06-15T06:01:34.986Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
root: INFO: 2019-06-15T06:01:34.988Z: JOB_MESSAGE_DETAILED: Fusing consumer group/WriteStream into WindowInto(WindowIntoFn)
root: INFO: 2019-06-15T06:01:34.990Z: JOB_MESSAGE_DETAILED: Fusing consumer split into decode
root: INFO: 2019-06-15T06:01:34.992Z: JOB_MESSAGE_DETAILED: Fusing consumer WindowInto(WindowIntoFn) into pair_with_one
root: INFO: 2019-06-15T06:01:35Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-06-15T06:01:35.049Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-06-15T06:01:35.099Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-06-15T06:01:35.260Z: JOB_MESSAGE_DEBUG: Executing wait step start2
root: INFO: 2019-06-15T06:01:35.272Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-06-15T06:01:35.277Z: JOB_MESSAGE_BASIC: Starting 1 workers...
root: INFO: 2019-06-15T06:01:38.899Z: JOB_MESSAGE_BASIC: Executing operation group/ReadStream+group/MergeBuckets+count+format+encode+WriteToPubSub/Write/NativeWrite
root: INFO: 2019-06-15T06:01:38.899Z: JOB_MESSAGE_BASIC: Executing operation ReadFromPubSub/Read+decode+split+pair_with_one+WindowInto(WindowIntoFn)+group/WriteStream
root: WARNING: Timing out on waiting for job 2019-06-14_23_01_29-13345869475325492462 after 183 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
root: ERROR: Timeout after 400 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/wc_subscription_output3353f170-f2ef-4a37-9694-7fb0022ba641.
--------------------- >> end captured logging << ---------------------
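
The JOB_MESSAGE_DETAILED fusion lines in the excerpt above name the stages of the streaming wordcount job (ReadFromPubSub, decode, split, pair_with_one, WindowInto, group, count, format, encode, WriteToPubSub). A rough sketch of a pipeline with that shape, using placeholder topic names rather than the generated wc_topic_* resources:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions
    from apache_beam.transforms import window

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as p:
        _ = (p
             | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(topic='projects/example/topics/input')
             | 'decode' >> beam.Map(lambda data: data.decode('utf-8'))
             | 'split' >> beam.FlatMap(lambda line: line.split())
             | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
             | 'WindowInto' >> beam.WindowInto(window.FixedWindows(15))
             | 'group' >> beam.GroupByKey()
             | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
             | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
             | 'encode' >> beam.Map(lambda text: text.encode('utf-8'))
             | 'WriteToPubSub' >> beam.io.WriteToPubSub(topic='projects/example/topics/output'))
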
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_01_32-13346695529144813204?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_17_07-17433162357332737025?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_25_48-17989189059257489841?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_33_07-862995266442282201?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_01_26-17553706526797211567?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_30_20-8205729744653938597?project=apache-beam-testing.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_41_36-18072942135671036376?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_01_29-9608968508130374378?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_15_56-2612307510760121958?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_28_07-14993175488283251590?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_39_55-6944372727564690962?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_49_40-10281484627442003419?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_01_27-14447586323835427059?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_23_43-9799768304985746147?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_33_59-17640367878122782611?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_44_43-5973334811290392862?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_01_25-1507395424419019115?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_11_12-10087160657840990589?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_20_54-6866885902894387985?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_32_54-2920441479087864752?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_01_25-14008925582764989507?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_17_58-11983007733050176212?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_24_59-9716897148535616517?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_36_40-1590310900121128019?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_01_29-13345869475325492462?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_15_46-1678528834971398817?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_24_48-11240242587628376215?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_32_19-14767610932821648333?project=apache-beam-testing.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_39_51-11029092481283047133?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_01_28-15048677427317407646?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_12_29-7749506382452111973?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_20_45-17575616834558200638?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_29_31-9675986890790352208?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_47_22-15573150628296880143?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_23_56_27-98892016766837765?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3786.817s

FAILED (SKIP=5, failures=1)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 78

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 4m 7s
77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/64opoxgjvrofs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #1146

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1146/display/redirect>

------------------------------------------
[...truncated 352.98 KB...]
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert_that/Match.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s18"
        },
        "serialized_fn": "eNq9VG1z20QQPtlO2qoxJQk0KS3gFgoKUAsor6UJNE7SJiZuUEN8BYJ6ks6WGr3t6dTEQzwDZZzJ7+Cv8IEv/ChWZ6chQPqR0eik3X32ub3dvf25bLgsZa7PbYezqC4Fi7NOIqKs7iaC6w0WhswJeVuwNOViKVmJdSBzv4DWh5JBy4QQuxND2fWCMKzbxarbruBMcruTx64MEnSoGCfsYcI8W/ZSrsMYPYsUjcTjmyjD+ADOWHDWaGpNgm+pOd2oHhKyT8ivGulq5AGcaw1An6Maeu3B+QFM0Ax/TT+JuPmYxztBnB19b2Qhe8LN3UTsZHhEbhYntDeSTDaSKAqkvdGTfhLftLe4CDo9MxOumXk7mZkqvfm3vJjHeTGLvNTTHlRV6LdDFjkeW4AX1qtjDQIXaAm1mJIXBzA5J2HKgukTh+9yaTMphQ4vKQInD0KJ0cLL9AyKaC6scPEAZiyYPeEaRGkipB0lXh5i7i7Ry+jwnOrBKwO4bMEVtY+NJK60bXj1AF6z4HU6Xig55CyEWuu/6udyFOCqXzH8UUXGmlNYkT8lIYeqIn2N9BaI1I7EUvE/LFa/TPZLZL9MdspErBJZIp420nRK5CIinmqkHf9EKhIxOhF/EE3T9tYK96XtRdKvkN4k2dfI4wrZrxSMWhsYoscU+rcCPSQd9scx6UOEUXzb6Cx+J6eAYo1Qj2BDXWvRGczECgtC7tVYlnEhb9Wui9r8PK7wxgG8adAKIsIgk3BdpS3DMnAP3qLTKCxi5u8ot+U9l6dFx8Pb9BxaipZeFiIRYCg3waPkCYc5qqOwxcJ8ZH1HwrtDBHNlUY/3aBUFvpdyF/ex1c436IVnO9tHJqgr5Eg78jZVI/GQRzyW8L6ED2j6v9wRnmEjd81cBmFxQT70a83dxjWiTYyVtQn1lLWZUlWr4ndSrVe0cVzhpurQZ4f6aAAf49X5xIJP/Vn/Ep39Z5sPN6oXG8FnA/jcgls+tvUXFtz2ay3/6jbMrz+aYwNYsODLAXzVhzv0fNHuxdCx/SCWGSyenHtoUPq6x/HqMJmITF+9X1TwXqHWoYFDb6nVh2WDTiBVkss0l4owg5WWog/iY9XdVn4A9xws3KoFawNoWvD1ANb70DL8Rb8gu49kG4a/0vIV9htnGCIT3QzzUExVy1/LJTywYFNVNBWJy7MMvvU3/3WaLUXZRkp6TPnQyZ1t+K4P32/DD88d8+0g9pJdzKkO28jzYx9sQ9VkVxkwlken+Q8R+t0wcVg45MFsMWRx6BQyyCDCcrEotd0kcoKYC3CbGp1U3e7mUR6y4soUU42DhxZ1tCCzPd5heSiBH6rLJEXQ7XKBoXROC2UE0ZeGnpsjEboYjJ87EoL6XxPqJkI=",
        "user_name": "assert_that/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-06-15T00:15:51.616922Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-06-14_17_15_49-5088480149181170374'
 location: 'us-central1'
 name: 'beamapp-jenkins-0615001525-745417'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-06-15T00:15:51.616922Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-06-14_17_15_49-5088480149181170374]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_15_49-5088480149181170374?project=apache-beam-testing
root: INFO: Job 2019-06-14_17_15_49-5088480149181170374 is in state JOB_STATE_RUNNING
root: INFO: 2019-06-15T00:15:49.590Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-06-14_17_15_49-5088480149181170374. The number of workers will be between 1 and 1000.
root: INFO: 2019-06-15T00:15:49.638Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-06-14_17_15_49-5088480149181170374.
root: INFO: 2019-06-15T00:15:53.911Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-06-15T00:15:54.542Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-06-15T00:15:55.070Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-06-15T00:15:55.167Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-06-15T00:15:55.220Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-06-15T00:15:55.284Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-06-15T00:15:55.442Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-06-15T00:15:55.614Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-06-15T00:15:55.663Z: JOB_MESSAGE_DETAILED: Fusing consumer row to string into read
root: INFO: 2019-06-15T00:15:55.726Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial into count/CombineGlobally(CountCombineFn)/KeyWithVoid
root: INFO: 2019-06-15T00:15:55.772Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract into count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine
root: INFO: 2019-06-15T00:15:55.826Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine into count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Read
root: INFO: 2019-06-15T00:15:55.893Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/KeyWithVoid into row to string
root: INFO: 2019-06-15T00:15:55.944Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write into count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify
root: INFO: 2019-06-15T00:15:55.993Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify into count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial
root: INFO: 2019-06-15T00:15:56.040Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/UnKey into count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract
root: INFO: 2019-06-15T00:15:56.089Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-06-15T00:15:56.134Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2019-06-15T00:15:56.180Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2019-06-15T00:15:56.230Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2019-06-15T00:15:56.276Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2019-06-15T00:15:56.313Z: JOB_MESSAGE_DETAILED: Unzipping flatten s15 for input s13.out
root: INFO: 2019-06-15T00:15:56.369Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
root: INFO: 2019-06-15T00:15:56.408Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2019-06-15T00:15:56.438Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_1
root: INFO: 2019-06-15T00:15:56.484Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2019-06-15T00:15:56.535Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2019-06-15T00:15:56.582Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault into count/CombineGlobally(CountCombineFn)/DoOnce/Read
root: INFO: 2019-06-15T00:15:56.629Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into count/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault
root: INFO: 2019-06-15T00:15:56.676Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-06-15T00:15:56.736Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-06-15T00:15:56.777Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-06-15T00:15:56.820Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-06-15T00:15:57.072Z: JOB_MESSAGE_DEBUG: Executing wait step start38
root: INFO: 2019-06-15T00:15:57.184Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2019-06-15T00:15:57.233Z: JOB_MESSAGE_BASIC: Executing operation count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
root: INFO: 2019-06-15T00:15:57.271Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-06-15T00:15:57.312Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-06-15T00:15:57.362Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Create
root: INFO: 2019-06-15T00:15:57.380Z: JOB_MESSAGE_BASIC: Finished operation count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
root: INFO: 2019-06-15T00:15:57.467Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2019-06-15T00:15:57.512Z: JOB_MESSAGE_DEBUG: Value "count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Session" materialized.
root: INFO: 2019-06-15T00:15:57.574Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-06-15T00:15:57.625Z: JOB_MESSAGE_BASIC: Executing operation read+row to string+count/CombineGlobally(CountCombineFn)/KeyWithVoid+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write
root: INFO: 2019-06-15T00:15:58.161Z: JOB_MESSAGE_BASIC: BigQuery export job "dataflow_job_13865570276322363339" started. You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_13865570276322363339".
root: INFO: 2019-06-15T00:16:28.504Z: JOB_MESSAGE_DETAILED: BigQuery export job progress: "dataflow_job_13865570276322363339" observed total of 1 exported files thus far.
root: INFO: 2019-06-15T00:16:28.535Z: JOB_MESSAGE_BASIC: BigQuery export job finished: "dataflow_job_13865570276322363339"
root: INFO: 2019-06-15T00:30:09.740Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. INTERNAL_ERROR: Internal error. Please try again or contact Google Support. (Code: '3592295983775020754')
root: INFO: 2019-06-15T00:30:09.793Z: JOB_MESSAGE_ERROR: Workflow failed.
root: INFO: 2019-06-15T00:30:09.867Z: JOB_MESSAGE_BASIC: Finished operation read+row to string+count/CombineGlobally(CountCombineFn)/KeyWithVoid+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write
root: INFO: 2019-06-15T00:30:09.872Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-06-15T00:30:10.020Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-06-15T00:30:10.282Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-06-15T00:30:10.327Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-06-15T00:30:28.764Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-06-15T00:30:28.813Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-06-14_17_15_49-5088480149181170374 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_02_01-9682289491347336035?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_18_42-15297173255798711410?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_35_44-5047631222908655650?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_01_57-17723855208883518823?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_31_46-7485476113418331922?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_42_02-10383443965285391948?project=apache-beam-testing.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_01_59-9464698263225408620?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_15_49-5088480149181170374?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_31_37-6431948827281122670?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_41_06-79938790118771747?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_01_59-2164796316076747477?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_23_46-7520086269631286708?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_31_48-14460598645370349837?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_41_36-7189271942717028320?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_01_59-14920038979398905524?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_12_04-14748387615489458080?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_27_36-8858023953514179587?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_38_35-831627758273875800?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_49_31-1547293659863532141?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_57_22-4286252978082555188?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_01_58-606714839038419088?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_14_10-2359788566445249390?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_21_47-9581417147414061325?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_30_10-5307441310200621432?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_40_27-7066394660522455917?project=apache-beam-testing.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
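(As a side note on the BigQuerySink deprecation warnings above, which recommend WriteToBigQuery: below is a minimal, hypothetical migration sketch. The table spec and schema are placeholders, not values from this build.)

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Options would normally come from the command line (--project, --temp_location, ...).
    options = PipelineOptions()
    with beam.Pipeline(options=options) as p:
        rows = p | 'CreateRows' >> beam.Create([{'word': 'beam', 'count': 1}])
        _ = rows | 'WriteToBQ' >> beam.io.WriteToBigQuery(
            table='my_dataset.my_table',          # placeholder table spec
            schema='word:STRING,count:INTEGER',   # simple text schema
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED)


if __name__ == '__main__':
    run()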
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_01_58-15979414702997689179?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_11_23-17188551917416200799?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_20_10-13880503372311680103?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_29_34-1695004558084511711?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_38_38-620505361536654403?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_46_15-9154159132161263795?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_01_58-14934778901686815017?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_14_37-9414047564585465516?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_26_01-18251210151622343756?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_17_36_40-10144419580959183522?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
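(On the repeated "options is deprecated" BeamDeprecationWarnings above: a minimal sketch of the pattern the warning points at is shown below, keeping a reference to the PipelineOptions object instead of reading them back off <pipeline>.options. The bucket path is a placeholder.)

import apache_beam as beam
from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

# Placeholder flags; in the integration tests these come from the test harness.
options = PipelineOptions(['--temp_location', 'gs://my-bucket/tmp'])

# Read settings from the options object you constructed...
temp_location = options.view_as(GoogleCloudOptions).temp_location

with beam.Pipeline(options=options) as p:
    # ...rather than from p.options, which the warning says will not be supported.
    _ = p | beam.Create([1, 2, 3]) | beam.Map(print)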

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3986.904s

FAILED (SKIP=5, errors=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py37/build.gradle'> line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle'> line: 78

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle'> line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 7m 32s
77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/iqsjbv6sltcqo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #1145

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1145/display/redirect?page=changes>

Changes:

[github] Add link to type hints design doc

------------------------------------------
[...truncated 242.80 KB...]
root: INFO: 2019-06-14T22:16:41.196Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/ToProtobuf into add_attribute
root: INFO: 2019-06-14T22:16:41.198Z: JOB_MESSAGE_DETAILED: Fusing consumer add_attribute into ReadFromPubSub/Map(_from_proto_str)
root: INFO: 2019-06-14T22:16:41.201Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write/NativeWrite into WriteToPubSub/ToProtobuf
root: INFO: 2019-06-14T22:16:41.208Z: JOB_MESSAGE_BASIC: The pubsub read for: projects/apache-beam-testing/subscriptions/psit_subscription_input9722dc9e-0e52-47dd-89d0-3f2b68e28c65 is configured to compute input data watermarks based on custom timestamp attribute timestamp. Cloud Dataflow has created an additional tracking subscription to do this, which will be cleaned up automatically. For details, see: https://cloud.google.com/dataflow/model/pubsub-io#timestamps-ids
root: INFO: 2019-06-14T22:16:41.212Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-06-14T22:16:41.266Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-06-14T22:16:41.316Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-06-14T22:16:41.417Z: JOB_MESSAGE_DEBUG: Executing wait step start2
root: INFO: 2019-06-14T22:16:41.430Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-06-14T22:16:41.435Z: JOB_MESSAGE_BASIC: Starting 1 workers...
root: INFO: 2019-06-14T22:16:42.157Z: JOB_MESSAGE_DETAILED: Pub/Sub resources set up for topic 'projects/apache-beam-testing/topics/psit_topic_input9722dc9e-0e52-47dd-89d0-3f2b68e28c65'.
root: INFO: 2019-06-14T22:16:43.593Z: JOB_MESSAGE_BASIC: Executing operation ReadFromPubSub/Read+ReadFromPubSub/Map(_from_proto_str)+add_attribute+WriteToPubSub/ToProtobuf+WriteToPubSub/Write/NativeWrite
root: WARNING: Timing out on waiting for job 2019-06-14_15_16_36-10030067362736144754 after 184 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
root: ERROR: Timeout after 300 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/psit_subscription_output9722dc9e-0e52-47dd-89d0-3f2b68e28c65.
--------------------- >> end captured logging << ---------------------
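(For context on the Pub/Sub pipeline in the captured logging above, which is configured with a custom timestamp attribute "timestamp" and id label "id" and then times out waiting on the output subscription: a minimal, hypothetical streaming sketch using those attributes follows. The project, topic, and subscription names are placeholders, not the psit_* resources from this run.)

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

options = PipelineOptions()
options.view_as(StandardOptions).streaming = True

with beam.Pipeline(options=options) as p:
    messages = p | 'Read' >> beam.io.ReadFromPubSub(
        subscription='projects/my-project/subscriptions/my-input-sub',  # placeholder
        with_attributes=True,
        timestamp_attribute='timestamp',  # watermarks are derived from this attribute
        id_label='id')                    # message id attribute used for deduplication
    _ = (messages
         | 'PayloadOnly' >> beam.Map(lambda msg: msg.data)  # PubsubMessage -> bytes
         | 'Write' >> beam.io.WriteToPubSub(
             'projects/my-project/topics/my-output-topic'))  # placeholder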
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_46_52-573700639454343764?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_02_24-7613325912405795228?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_10_21-3115902165690012565?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_20_51-15811101868117739699?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_46_50-15866613520593150697?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_06_38-222507866254694166?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_18_43-14780479889137939742?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_31_05-13428502982685190852?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_46_52-12208201668896896502?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_00_08-6041799538303465943?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_10_08-12621833628768562962?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_19_19-9978393115055725728?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_46_49-5042405210749797800?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_05_55-17517631484335001885?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_16_24-16971158971889703066?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_46_50-4197220005869355671?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_58_42-6606738919577941867?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_12_00-2492218510286740510?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_20_48-17929566164009932423?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_32_58-12930671101705528061?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_46_48-5520914782620094324?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_54_49-11255087327303888218?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_07_31-5937413984281562775?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_16_36-10030067362736144754?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_46_53-7067388929138255066?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_55_57-14140417103947084224?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_03_27-15125157204524614754?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_16_28-3289284216264647526?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_26_59-10675941391269503438?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_36_49-9906508046568353691?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_46_51-12727039057400858314?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_56_28-12171396020762766935?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_12_41-5619861821223704213?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_30_43-12656856249463586775?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_39_19-16320420762623116421?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3910.509s

FAILED (SKIP=5, failures=1)

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_47_09-13198749906511101824?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_02_45-2539146709006673439?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_10_27-4908423878855778460?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_18_48-892356613792794123?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
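(On the FutureWarnings above about the experimental fileio transforms, MatchAll and ReadMatches: a small, self-contained sketch of the same transforms is shown below. The glob pattern is a placeholder, and unlike fileio_test.py it simply reads the matched files rather than computing checksums.)

import apache_beam as beam
from apache_beam.io import fileio

with beam.Pipeline() as p:
    contents = (
        p
        | 'Match' >> fileio.MatchFiles('/tmp/input/*.txt')  # placeholder glob
        | 'Read' >> fileio.ReadMatches()                     # FileMetadata -> ReadableFile
        | 'Contents' >> beam.Map(lambda f: f.read_utf8()))   # read each matched file
    _ = contents | 'Print' >> beam.Map(print)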
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_47_10-15569391334800729512?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_26_18-286790434144817145?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_36_59-11497977924568857903?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_47_09-2968595221909499317?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_00_51-9179712221924359468?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_14_07-15205454900367072573?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_24_23-5693284544784538945?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_42_52-8548101455169750546?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_47_08-1285747874677201415?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_14_52-4084972846362882445?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_27_03-6222422401442251788?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_47_09-5155643022827734211?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_55_49-13812233800992671130?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_05_26-16784784177794997382?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_16_18-7057599762809252968?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_26_59-16450559857890511072?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_34_11-10829353345641256060?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_47_08-2820183806254963504?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_54_45-11560316387746022518?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_06_12-2732541766656211855?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_17_57-3039362307578161451?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_26_26-6441765754850646162?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_47_10-5946201796518159109?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_58_53-10674486260114162009?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_08_05-14192164638320731495?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_20_19-5932250612712735828?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_47_09-14038742105649748374?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_55_55-10627059700332359840?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_04_30-12480256314494379018?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_16_37-6425513701846047370?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_15_24_59-2000108592417640944?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform_streaming (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... SKIP: TestStream is not supported on TestDataflowRunner
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3991.614s

OK (SKIP=5)

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py35/build.gradle'> line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle'> line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 7m 26s
77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/cb2c4zz66ryrk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #1144

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1144/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-7450] Support unbounded reads with HCatalogIO

------------------------------------------
[...truncated 412.94 KB...]
          "step_name": "s10"
        },
        "serialized_fn": "%0AB%22%40%0A%1Dref_Coder_GlobalWindowCoder_1%12%1F%0A%1D%0A%1Bbeam%3Acoder%3Aglobal_window%3Av1jT%0A%25%0A%23%0A%21beam%3Awindowfn%3Aglobal_windows%3Av0.1%10%01%1A%1Dref_Coder_GlobalWindowCoder_1%22%02%3A%00%28%010%018%01H%01",
        "user_name": "GroupByKey"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s12",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "<lambda>"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }
                  ],
                  "is_pair_like": true,
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "m_out.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s11"
        },
        "serialized_fn": "ref_AppliedPTransform_m_out_17",
        "user_name": "m_out"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-06-14T21:10:12.025591Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-06-14_14_10_10-3331578267800895369'
 location: 'us-central1'
 name: 'beamapp-jenkins-0614210945-622929'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-06-14T21:10:12.025591Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-06-14_14_10_10-3331578267800895369]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_10_10-3331578267800895369?project=apache-beam-testing
root: INFO: Job 2019-06-14_14_10_10-3331578267800895369 is in state JOB_STATE_RUNNING
root: INFO: 2019-06-14T21:10:10.797Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-06-14_14_10_10-3331578267800895369. The number of workers will be between 1 and 1000.
root: INFO: 2019-06-14T21:10:10.872Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-06-14_14_10_10-3331578267800895369.
root: INFO: 2019-06-14T21:10:14.188Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-06-14T21:10:14.859Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-06-14T21:10:15.453Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
root: INFO: 2019-06-14T21:10:15.494Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
root: INFO: 2019-06-14T21:10:15.630Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-06-14T21:10:15.684Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-06-14T21:10:15.733Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-06-14T21:10:15.778Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-06-14T21:10:15.830Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-06-14T21:10:15.897Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-06-14T21:10:15.935Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2254>) into Create/Impulse
root: INFO: 2019-06-14T21:10:15.985Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
root: INFO: 2019-06-14T21:10:16.031Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow
root: INFO: 2019-06-14T21:10:16.082Z: JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
root: INFO: 2019-06-14T21:10:16.137Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
root: INFO: 2019-06-14T21:10:16.178Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read
root: INFO: 2019-06-14T21:10:16.226Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
root: INFO: 2019-06-14T21:10:16.274Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
root: INFO: 2019-06-14T21:10:16.373Z: JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
root: INFO: 2019-06-14T21:10:16.422Z: JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Map(decode)
root: INFO: 2019-06-14T21:10:16.482Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
root: INFO: 2019-06-14T21:10:16.521Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
root: INFO: 2019-06-14T21:10:16.571Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2254>)
root: INFO: 2019-06-14T21:10:16.614Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
root: INFO: 2019-06-14T21:10:16.661Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify
root: INFO: 2019-06-14T21:10:16.715Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-06-14T21:10:16.756Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-06-14T21:10:16.808Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-06-14T21:10:16.837Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-06-14T21:10:17.052Z: JOB_MESSAGE_DEBUG: Executing wait step start23
root: INFO: 2019-06-14T21:10:17.156Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
root: INFO: 2019-06-14T21:10:17.206Z: JOB_MESSAGE_BASIC: Executing operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
root: INFO: 2019-06-14T21:10:17.219Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-06-14T21:10:17.269Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-06-14T21:10:17.337Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
root: INFO: 2019-06-14T21:10:17.354Z: JOB_MESSAGE_BASIC: Finished operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
root: INFO: 2019-06-14T21:10:17.459Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
root: INFO: 2019-06-14T21:10:17.505Z: JOB_MESSAGE_DEBUG: Value "Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
root: INFO: 2019-06-14T21:10:17.599Z: JOB_MESSAGE_BASIC: Executing operation Create/Impulse+Create/FlatMap(<lambda at core.py:2254>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
root: INFO: 2019-06-14T21:36:19.574Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. INTERNAL_ERROR: Internal error. Please try again or contact Google Support. (Code: '3106564609476466998')
root: INFO: 2019-06-14T21:36:19.706Z: JOB_MESSAGE_ERROR: Workflow failed.
root: INFO: 2019-06-14T21:36:19.768Z: JOB_MESSAGE_BASIC: Finished operation Create/Impulse+Create/FlatMap(<lambda at core.py:2254>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
root: INFO: 2019-06-14T21:36:19.916Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-06-14T21:36:19.997Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-06-14T21:36:20.034Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-06-14T21:36:35.652Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-06-14T21:36:35.715Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-06-14_14_10_10-3331578267800895369 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_13_32_56-11097936887077992299?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_13_49_25-4243789662559451601?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_13_57_02-4747886205051950296?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_05_24-8082833087948069875?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_18_07-9808711793531009970?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_13_32_52-16799263842630003640?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_13_32_53-13268811300961535891?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
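(The BeamDeprecationWarning above is raised inside the SDK when it reads <pipeline>.options; the usual way to avoid the equivalent pattern in user code is to keep a reference to the PipelineOptions used to build the pipeline rather than reading them back off the Pipeline object. A hedged sketch, with a placeholder temp_location bucket:)

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (GoogleCloudOptions,
                                                      PipelineOptions)

    options = PipelineOptions(['--temp_location', 'gs://my-bucket/tmp'])
    gcp_options = options.view_as(GoogleCloudOptions)

    with beam.Pipeline(options=options) as p:
        # Read settings from the options object you already hold, rather
        # than via p.options.view_as(GoogleCloudOptions).temp_location.
        _ = (p
             | beam.Create([gcp_options.temp_location])
             | beam.Map(print))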
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_13_46_26-17265396660091823225?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_05_22-3167529752356174250?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
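(The FutureWarnings above come from the experimental fileio transforms exercised in fileio_test.py; a hedged sketch of that usage, with a placeholder GCS pattern:)

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        contents = (
            p
            | 'Patterns' >> beam.Create(['gs://my-bucket/input/*.txt'])
            | 'MatchAll' >> fileio.MatchAll()        # experimental
            | 'ReadMatches' >> fileio.ReadMatches()  # experimental
            | 'Read' >> beam.Map(lambda readable: readable.read_utf8()))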
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_13_32_50-12567323653838371806?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_07_39-7584796403781716967?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_13_32_50-558290168060868528?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
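(The warning above points at the deprecated BigQuerySink path; a hedged sketch of the migration it suggests, with placeholder table and schema not taken from this build:)

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = p | beam.Create([{'word': 'beam', 'count': 1}])
        _ = rows | beam.io.WriteToBigQuery(
            table='my-project:my_dataset.word_counts',
            schema='word:STRING,count:INTEGER',
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)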
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_13_42_16-8116741746007468481?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_00_39-11398830745823394756?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_10_15-4880594967999667677?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_21_43-1922698498495304846?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_13_32_48-4292607022515105533?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_13_41_26-8753024609152722336?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_13_51_02-4700722205718411954?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_02_31-18272213107744638994?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_10_54-3255154751439727400?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_13_32_52-15506592848266738334?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_13_45_04-6917804488963889384?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_13_59_39-5043030259056262983?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_10_10-3331578267800895369?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_37_08-12444624668734782471?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_13_32_50-2383674529707274117?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_13_42_47-1303114738738537995?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_13_50_30-17542635705675900262?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_00_39-7233940791844017533?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_14_10_22-1909343355487256936?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 4394.397s

FAILED (SKIP=5, errors=2)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle'> line: 78

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle'> line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 14m 35s
77 actionable tasks: 62 executed, 15 from cache

Publishing build scan...
https://gradle.com/s/bk7annxkpmkjw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #1143

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1143/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-7432] Set input and output files for sdist

[ehudm] Reduce gradle build verbosity

------------------------------------------
[...truncated 387.51 KB...]
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "modify_data.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "serialized_fn": "ref_AppliedPTransform_modify_data_4",
        "user_name": "modify_data"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s3",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s2"
        },
        "pubsub_id_label": "id",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_output1f9ef78c-a5c9-402f-8a92-27394af38d4f",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
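(The truncated job graph above, with steps s1-s3, a CallableWrapperDoFn named modify_data and a pubsub ParallelWrite, is the serialized form of a small streaming pipeline; a hedged reconstruction of its shape, with placeholder topic names rather than the psit_topic_* resources in the graph:)

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (PipelineOptions,
                                                      StandardOptions)

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as p:
        _ = (p
             | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
                 topic='projects/my-project/topics/input')
             # beam.Map is what shows up as CallableWrapperDoFn in the graph.
             | 'modify_data' >> beam.Map(lambda data: data + b'!')
             | 'WriteToPubSub' >> beam.io.WriteToPubSub(
                 topic='projects/my-project/topics/output'))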
root: INFO: Create job: <Job
 createTime: '2019-06-14T19:29:41.109888Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-06-14_12_29_39-17350868692146587176'
 location: 'us-central1'
 name: 'beamapp-jenkins-0614192916-272256'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-06-14T19:29:41.109888Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-06-14_12_29_39-17350868692146587176]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_29_39-17350868692146587176?project=apache-beam-testing
root: INFO: Job 2019-06-14_12_29_39-17350868692146587176 is in state JOB_STATE_RUNNING
root: INFO: 2019-06-14T19:29:43.647Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-06-14T19:29:44.290Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-a.
root: INFO: 2019-06-14T19:29:44.801Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
root: INFO: 2019-06-14T19:29:44.803Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
root: INFO: 2019-06-14T19:29:44.810Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-06-14T19:29:44.815Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
root: INFO: 2019-06-14T19:29:44.817Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
root: INFO: 2019-06-14T19:29:44.820Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-06-14T19:29:44.833Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-06-14T19:29:44.835Z: JOB_MESSAGE_DETAILED: Fusing consumer modify_data into ReadFromPubSub/Read
root: INFO: 2019-06-14T19:29:44.837Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write/NativeWrite into modify_data
root: INFO: 2019-06-14T19:29:44.843Z: JOB_MESSAGE_BASIC: The pubsub read for: projects/apache-beam-testing/subscriptions/psit_subscription_input1f9ef78c-a5c9-402f-8a92-27394af38d4f is configured to compute input data watermarks based on custom timestamp attribute timestamp. Cloud Dataflow has created an additional tracking subscription to do this, which will be cleaned up automatically. For details, see: https://cloud.google.com/dataflow/model/pubsub-io#timestamps-ids
root: INFO: 2019-06-14T19:29:44.846Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-06-14T19:29:44.858Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-06-14T19:29:44.868Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-06-14T19:29:45.081Z: JOB_MESSAGE_DEBUG: Executing wait step start2
root: INFO: 2019-06-14T19:29:45.095Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-06-14T19:29:45.100Z: JOB_MESSAGE_BASIC: Starting 1 workers...
root: INFO: 2019-06-14T19:29:47.224Z: JOB_MESSAGE_DETAILED: Pub/Sub resources set up for topic 'projects/apache-beam-testing/topics/psit_topic_input1f9ef78c-a5c9-402f-8a92-27394af38d4f'.
root: INFO: 2019-06-14T19:29:47.352Z: JOB_MESSAGE_BASIC: Executing operation ReadFromPubSub/Read+modify_data+WriteToPubSub/Write/NativeWrite
root: WARNING: Timing out on waiting for job 2019-06-14_12_29_39-17350868692146587176 after 183 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
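(The google.auth and urllib3 DEBUG lines above are the client library fetching default credentials and an access token from the GCE metadata server; a hedged sketch of the equivalent call in application code:)

    import google.auth
    import google.auth.transport.requests

    # On a GCE/Dataflow worker this resolves to the metadata-server-backed
    # compute credentials and the project id, as in the requests logged above.
    credentials, project_id = google.auth.default()
    credentials.refresh(google.auth.transport.requests.Request())
    print(project_id, credentials.valid)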
root: ERROR: Timeout after 300 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/psit_subscription_output1f9ef78c-a5c9-402f-8a92-27394af38d4f.
--------------------- >> end captured logging << ---------------------
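(The captured log above shows the streaming Pub/Sub test configuring custom id and timestamp attributes, which is why Dataflow creates the extra tracking subscription, and then timing out with 0 messages on the output subscription. A hedged sketch of the read configuration involved, with placeholder topic/subscription names:)

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (PipelineOptions,
                                                      StandardOptions)

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as p:
        messages = p | beam.io.ReadFromPubSub(
            subscription='projects/my-project/subscriptions/my-input-sub',
            with_attributes=True,
            id_label='id',
            timestamp_attribute='timestamp')
        _ = messages | beam.Map(lambda msg: msg.data)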
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_04_06-4560926230509832187?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_21_04-17104436619119462746?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_28_51-18206302718375521454?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_39_29-4934133231468491337?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_49_45-10345769167852006980?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_04_08-10205851379203195090?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_28_56-7666156257808187382?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_39_19-5150943150221535029?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_48_07-141213530335371758?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_04_08-7216846808336854419?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_18_11-7665197844798578324?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_28_27-3688972581543237326?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_36_42-15703027007098308937?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_04_07-15474433470706558999?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_31_11-7567346735602453361?project=apache-beam-testing.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_41_37-9705587529212859935?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_04_04-2235536624909090415?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_15_00-875692350431272052?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_23_36-16608845779512787779?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_31_36-16741846820097984425?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_41_13-538125232691357724?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_04_07-6079794052919717398?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_14_17-16612362705411193497?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_23_49-18198079035414335041?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_31_27-8668699992082601087?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_50_18-2073525013802192371?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_58_19-14100158840995601374?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_04_11-8051281318628847540?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_12_24-10404927978745578990?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_29_39-17350868692146587176?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_48_20-15730494390139154091?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_04_07-11690850727376236052?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_15_31-10335853904963109281?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_24_50-10382557328931365840?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_12_33_23-15202628343887854030?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 4196.653s

FAILED (SKIP=5, errors=2)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build completed with 4 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py36/build.gradle'> line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle'> line: 78

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle'> line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 10m 55s
75 actionable tasks: 58 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/auy4zc77ysgh6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #1142

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1142/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-7531] Remove and disallow KMS key setting

[robertwb] Remove unused **kwargs arguments for various transforms.

------------------------------------------
[...truncated 575.35 KB...]
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_15_23-15465660678640410516?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_26_50-7669405018845572927?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_38_32-10347318326641526779?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_45_58-12301756702843328464?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_00_20-14072105795284157487?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_20_23-13603692597074996790?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_29_57-6389836312010132132?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_39_40-13118597044269044061?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_00_19-5703811466872649732?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_13_24-15410242575291270819?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_20_47-12580603308550853555?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_29_24-6507609926307350287?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_38_37-2357484964527311719?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_00_18-1664240792123185652?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_27_46-9028372360513965918?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_37_37-16403018291331286908?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_00_18-8507302703315258046?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_20_44-8925471508358037319?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_28_16-14916901202686643140?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_37_08-6397066111242324663?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_00_17-6576974807063229399?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_19_05-232281226859799067?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_27_02-5499714157889183796?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_00_23-2356197008544761873?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_09_16-17774438511493538603?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_17_23-4105472325077697985?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_27_46-11061296416206725762?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_35_58-11254501952795599706?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_00_19-2239013546877935759?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_09_35-9767006822473471818?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_17_01-3031225965495321812?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_26_38-4711550797244766899?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_35_38-18376900933192811059?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_44_03-2919624543953021066?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3415.716s

FAILED (SKIP=5, failures=1)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_00_22-18002377293041398629?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_16_12-17867119521415025416?project=apache-beam-testing.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_26_38-8192957581749619229?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_35_14-17125650050859495897?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_00_17-10601571518236405445?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_25_18-16392955480504429920?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_35_02-8046056396645604297?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_00_24-13174026986962147989?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_14_42-8966634773504485481?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_27_49-3012615260741088261?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_38_32-15979850871067794674?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_00_18-2746572380876244377?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_22_40-395121357740549025?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_31_12-1980728013372772796?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_40_45-2578635981267064144?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_00_17-17541075396490125790?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_20_00-11230343049947033080?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_31_41-2911513781592558307?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_00_16-16316157104372655590?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_08_23-3077550273052333617?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_17_06-6797857519168422974?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_26_53-1999515229145389295?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_36_59-12171759640804804122?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_47_50-15229611183079058473?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_00_22-5849331070150558470?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_09_31-17287422817321862364?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_17_01-4211360306664788726?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_26_09-3826007188470169654?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_34_34-11291113739327270798?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_43_45-9549963256754790094?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_00_19-6319698010616821669?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_09_46-4256731089989205324?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_19_59-14655707664085582779?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_27_55-332466945468582831?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_11_38_21-15794488437525544656?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform_streaming (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... SKIP: TestStream is not supported on TestDataflowRunner
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3476.934s

OK (SKIP=5)

FAILURE: Build completed with 4 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py37/build.gradle'> line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py36/build.gradle'> line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle'> line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle'> line: 78

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 50s
77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/dra56o4jvxx5i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #1141

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1141/display/redirect?page=changes>

Changes:

[github] Adding a Link to The Doc for Cost Estimation

------------------------------------------
[...truncated 693.62 KB...]
    response = self._client.projects_locations_jobs_messages.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 553, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-06-14_10_36_34-17514122066169117000/messages?startTime=2019-06-14T17%3A41%3A21.082Z&alt=json>: response: <{'content-type': 'application/json; charset=UTF-8', 'content-length': '280', 'date': 'Fri, 14 Jun 2019 17:45:35 GMT', 'server': 'ESF', 'x-frame-options': 'SAMEORIGIN', 'cache-control': 'private', 'status': '404', 'x-content-type-options': 'nosniff', '-content-encoding': 'gzip', 'vary': 'Origin, X-Origin, Referer', 'transfer-encoding': 'chunked', 'x-xss-protection': '0'}>, content <{
  "error": {
    "code": 404,
    "message": "(12a33e970ff2368c): Information about job 2019-06-14_10_36_34-17514122066169117000 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>

Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_09_52_41-5843490689298781623?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
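The BeamDeprecationWarning above points at WriteToBigQuery as the replacement for BigQuerySink. A minimal sketch of that migration, assuming a hypothetical project, dataset, table, and schema (not taken from this job):

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = p | beam.Create([{'month': 1, 'tornado_count': 5}])

        # Deprecated since 2.11.0:
        #   rows | beam.io.Write(beam.io.BigQuerySink(
        #       'my_dataset.my_table',
        #       schema='month:INTEGER,tornado_count:INTEGER'))

        # Recommended replacement:
        _ = rows | beam.io.WriteToBigQuery(
            'my-project:my_dataset.my_table',  # hypothetical table spec
            schema='month:INTEGER,tornado_count:INTEGER',
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)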
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_21_30-3910016187593295565?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_30_31-9488208664516528217?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
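The FutureWarnings above come from the still-experimental fileio match transforms exercised by fileio_test.py. A minimal sketch of the pattern those tests use (match files, read them, hash the contents), assuming a hypothetical GCS glob:

    import hashlib

    import apache_beam as beam
    from apache_beam.io import fileio

    def compute_hash(readable_file):
        # readable_file is a fileio.ReadableFile; read() returns raw bytes.
        return hashlib.sha256(readable_file.read()).hexdigest()

    with beam.Pipeline() as p:
        _ = (
            p
            | 'Match' >> fileio.MatchFiles('gs://my-bucket/input/*.txt')  # hypothetical glob
            | 'Read' >> fileio.ReadMatches()
            | 'Checksums' >> beam.Map(compute_hash))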
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
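The "options is deprecated" warning refers to reading configuration back through <pipeline>.options. A minimal sketch of the recommended direction, assuming a hypothetical temp bucket: build the PipelineOptions first, keep a reference to them, and pass them to the Pipeline explicitly:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    # Construct the options once and keep the reference, instead of
    # reading them back later via <pipeline>.options.
    options = PipelineOptions(['--temp_location', 'gs://my-bucket/tmp'])  # hypothetical bucket
    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(print)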
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_09_52_46-4850355889849517071?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_06_16-17270534899428549690?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_16_50-3240072540699959289?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_29_07-2528761362765823685?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_09_52_41-2338958481547340939?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_12_13-3983237533031823791?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_22_58-3181098235909834368?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_09_52_42-1444778419248925076?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_02_00-8689872910415803403?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_20_03-6898778309910169579?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_30_45-17056488746239099378?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_38_18-7777319034965885115?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_09_52_40-2570801878885052667?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_01_48-199912697905912961?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_12_20-6594873782330794779?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_19_55-18090136175144794115?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_35_14-12372024963983706071?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_09_52_46-3588565912184205394?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_02_37-13029585750956129021?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_12_41-3527180740083100881?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_21_41-18116456970265562311?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_09_52_42-650002389245661040?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_01_54-7053658713981411276?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_09_56-18412923702574964593?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_18_38-684564875913020739?project=apache-beam-testing.
Exception in thread Thread-6:
Traceback (most recent call last):
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_30_49-12062993653055486580?project=apache-beam-testing.
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_42_10-17085115418094513288?project=apache-beam-testing.
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 157, in poll_for_job_completion
    response = runner.dataflow_client.get_job(job_id)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 661, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 689, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-06-14_10_42_10-17085115418094513288?alt=json>: response: <{'content-type': 'application/json; charset=UTF-8', 'content-length': '279', 'date': 'Fri, 14 Jun 2019 17:45:21 GMT', 'server': 'ESF', 'x-frame-options': 'SAMEORIGIN', 'cache-control': 'private', 'status': '404', 'x-content-type-options': 'nosniff', '-content-encoding': 'gzip', 'vary': 'Origin, X-Origin, Referer', 'transfer-encoding': 'chunked', 'x-xss-protection': '0'}>, content <{
  "error": {
    "code": 404,
    "message": "(6c67a5b054bec00): Information about job 2019-06-14_10_42_10-17085115418094513288 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>
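The HttpNotFoundError above comes from the monitoring thread polling a Dataflow job whose metadata is not (or is no longer) visible to the API. A generic sketch of tolerating such a transient NOT_FOUND in a polling loop, assuming the caller supplies the API call as a closure; this is only an illustration, not how dataflow_runner.py itself handles the error:

    import time

    from apitools.base.py.exceptions import HttpNotFoundError

    def call_with_404_retry(fetch, attempts=5, delay_secs=10):
        # Retry an API call that may transiently return 404 while job
        # metadata propagates; re-raise once the attempts are exhausted.
        for attempt in range(attempts):
            try:
                return fetch()
            except HttpNotFoundError:
                if attempt == attempts - 1:
                    raise
                time.sleep(delay_secs)

    # Hypothetical usage with an apitools Dataflow client:
    #   job = call_with_404_retry(
    #       lambda: client.projects_locations_jobs.Get(request))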


----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3544.699s

FAILED (SKIP=5, failures=2)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_09_52_24-1793383389911855287?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_08_20-15186625243520138208?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_18_16-9122740393383471357?project=apache-beam-testing.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_31_07-14221185874291173343?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_09_52_21-11681001074908560399?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_17_19-7750420256617327871?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_28_19-3989566082508159365?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_09_52_23-6760395613173639253?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_07_34-14659723483119490827?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_18_36-12348226608895731421?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_30_26-15397880787666234252?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_39_22-13134076383331552143?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_09_52_21-10108266698607495335?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_11_38-16718388818244860074?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_18_50-12094436833901189271?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_26_23-9528296485949421865?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_09_52_21-9158991251179011041?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_01_12-12334986964844028497?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_10_47-15568740835814868733?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_27_54-13838910211686711427?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_09_52_20-1529296984976243511?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_03_16-10322774363868629617?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_11_07-14504651271393772317?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_18_48-7378362447309353936?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_26_54-9649527313659104935?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_35_05-12835381887423891321?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_09_52_24-10482049430999691299?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_03_17-15370867853934419760?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_11_09-10503378722596059943?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_21_56-6219090574853004954?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_28_47-794146097197311921?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_09_52_23-17451872689024183233?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_11_21-10803972994548789290?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_19_17-8676151947893141135?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-14_10_29_51-11364673631377941149?project=apache-beam-testing.
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform_streaming (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... SKIP: TestStream is not supported on TestDataflowRunner
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 4009.218s

OK (SKIP=5)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle'> line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 7m 47s
77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/y4hm4zaiynx3q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org