Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/08/02 20:20:46 UTC

Build failed in Jenkins: beam_PostCommit_Python35 #124

See <https://builds.apache.org/job/beam_PostCommit_Python35/124/display/redirect?page=changes>

Changes:

[github] Retry Datastore writes on [Errno 32] Broken pipe

[ankurgoenka] [BEAM-7868] optionally skip hidden pipeline options

[udim] [BEAM-7577] Allow ValueProviders in Datastore Query filters (#8950)

------------------------------------------
[...truncated 165.55 KB...]
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "_finalize_write"
          }
        ],
        "non_parallel_inputs": {
          "side0-WriteUserScoreSums/Write/WriteImpl/FinalizeWrite": {
            "@type": "OutputReference",
            "output_name": "out",
            "step_name": "SideInput-s18"
          },
          "side1-WriteUserScoreSums/Write/WriteImpl/FinalizeWrite": {
            "@type": "OutputReference",
            "output_name": "out",
            "step_name": "SideInput-s19"
          },
          "side2-WriteUserScoreSums/Write/WriteImpl/FinalizeWrite": {
            "@type": "OutputReference",
            "output_name": "out",
            "step_name": "SideInput-s20"
          }
        },
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteUserScoreSums/Write/WriteImpl/FinalizeWrite.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s7"
        },
        "serialized_fn": "eNrNV2l728YRBkn5CHLUsRvHrtOWdeMWSkLCVmSlVpO0CW3ZKmNGBeUYSesiC2DJhQVgMdiFaKVmDruUnTZnz1y90vS+m953/0B/U2eXolwmUtp8yZPnkbDc2Z2ZnXdm3gUer1gByUjAqOdTktRlTlLR4Xki6gHPqdkgcUz8mJ7OSZbR/CifS00wJp+AUh/KllsxDMPrpFAZMxJx/POJoKbXiVISR49Qr5dHkpow4W5DlSznARUCtrAJ92plQq5k1GNRKgVsHT8PLmh5PaR4ICJ5Lsz5+xdRfEKJTdiGh9ne6sNVlnsNmuKFzAqpDQowW9p8lF4RXd0qVuEaX+/1OgVGl3dFRgO4tlX4Z+A6642RSHpORtz0FnFsR+mSCe9Bjzv6cL2lY/EYJSHNYWfL3YHTThRTLyOSeVlOO9E52DVmkGdoLBX1ZRIXuC/nyxEqm21JZBQ8oIQLIxm8F/3c0Ifdlmui4aGGCgJuDMIojuueeppezEmo5Sbs0QkRMoe9A3ifA/vcLSNNuMkF/N0Vs7YtaZLVBEJJurSGya7RNKxJPhyokMLOVmqRrAUxL0J7iKh9uHPH1PQRMlOjB6eC2nTgT9dIMBPWpv2pDpmamQroDLVzKooY9QtB85pQFQTvL3z3KvScRMna8T+g41G42llMohQ+qJELeIKQCYH4DPdV3QkUk0Jy+JBWSYvEE4zkoYD9TeMK3ClB06LoKLg/zBRsN/fhgMVuZPvYTS7uMuAjeIrrFTZKfaigqpxI+Kh7M8prB6y1pSKZPHg4rPEOiq44VDKw3L0q4aoTUoQ8J1EcpV0vpT0csbYmL7u7x310Y+6PHN2iF2u3bGj6Vp2pgKtKum2sYrRMrA3mIm/LHJ02uC6RGsZa74PtF/543aaS5th59UJGsXlP3i0SmsqFmASU8VirHkTVQ0w9p5olPd5O+zB9Bg5b7oyKAovwYO20attTmM22Sma7SIStRcPnfJLF9txai2sJzIydI9O1Z3qn0iwKlmIattHqvGpHE+7ow8csneKQSAJHNlJc334Ut5gwi6f8eB/uHHbeckR7inzuGmuHIKdEYnaLNFCtZsLdFtvjbsf9CjNFHPCJAXzSgXusZqlp4H+luatx7WXDOG8Yl0vGxbLRhntbA2hMai0MKlcMCEcHcMwNUWIznlD7LE2XolSMxpqIyTK1ezxfEhgHtVUY3gIXssGTJJLewopkPL39sC3ywBbhkmoyJbH/K2x7GHY9W4E57fzOmCR+SO6G4yfvLTUMOOHeoIo+54mXF6lULbV+vnnNaRqUNZaBT61Cc1LCfQ6cHMOoS6VHpMQqaGk3fhHFEoOA+zWwuKxWYWEVPu2AM6YaJRnPpZfwsIiRcdruTtWjb8ocLA7glAMPaPMe6gbS8+D0KrgOPMiOtzZKWUBxAg8xTBjmpoy5qTS3NRuNCxLZomScLasMhTpDF0tGH6dlQxwyzuNSxQjLhpwwLho6f3KLWsNNYcVYKht5U03DCWNvW241wi3G+rrcNlq5MqnoSadi7MbhAtowsCA+Y7WaZQ1OSDsEOQ4+q6rHfRQlCw0ex1RXW5V3qgJ7oXogrPYiyaoJdk1VMoIrKa3SmKo+rJJAXX80rBJRJaiQdmMqUVslr16di3Ihq7LHR/tFlaYBL1RPKx20uP+A2H+bftbhjITPDfsojoQET7OtqgrJeSzgYXermos4CigQfUFgfsF3r8Nfx5JMrqw3GQR6OaYphJpw9YV0LM95DpTtldBxy9o2dDUUozJj2ru6ByG6BGfZnGY6b+MyXTr5WqmxwyjvKW0t7SrtLG0vVUqVMsSTWKiJAyl7kC218D7nEjIHYAC5A4I91AepCbwXpSHveQmSsOJebP9isxcYxWD66hemJmISe2/SNmFZDKDnwLlVWOnDIxI+78D5TTz12bJGbZg+dcsrloVH3X2qh9D7rHLpaZ+zo4Bnlw/BY4XPULm4BI+vc+
uht8+tTzDkzAsWO8IUEV7swxcsdtdauwyprNRsNybOq4IdIIOtTjIkrUvvHGldHietJ0/+u8ROqMR+0YEvYWKfVIl9SsLTDjyjAGH/G7pn16F7bh26qbcP3fMKui+PoPtKH776Rug2uAW+hhh+XWP4jXcOwxfGMXwRiZ+dYPMMufwlRPJlB15BJF9ssXcnUX5TESV715DjtyR822Iee5gR5rOAhUwT2XdYlzGGZPVd9sJbkdWrG5PV91RNv+bA9zETr6qa/oGEHzrwowH82IGfKLL66SYU8jOm+ObnDvxiFX7Zh19J+LUDv/k/e+H19V74LX3Lr7bT2i+6NOF3WO6/78MfLH1Rq+iEJEnm4du2jy+tOfyxWdJnRdiLpIiJypS63in8CVfUXYIvnN0uzfH0f97M69oW8+jwclxcm8Jf0Ptfh594kfBGV+ffhjwxRAfN/n0zs8Md5nFN3sOg8IvvH2j0n4Uv4V/1/wANEt93",
        "user_name": "WriteUserScoreSums/Write/WriteImpl/FinalizeWrite/FinalizeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-08-02T19:24:46.775764Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-08-02_12_24_45-7982724228651244612'
 location: 'us-central1'
 name: 'beamapp-jenkins-0802192418-383241'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-08-02T19:24:46.775764Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-08-02_12_24_45-7982724228651244612]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_24_45-7982724228651244612?project=apache-beam-testing
root: INFO: Job 2019-08-02_12_24_45-7982724228651244612 is in state JOB_STATE_RUNNING
root: INFO: 2019-08-02T19:24:45.348Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-08-02_12_24_45-7982724228651244612. The number of workers will be between 1 and 1000.
root: INFO: 2019-08-02T19:24:45.407Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-08-02_12_24_45-7982724228651244612.
root: INFO: 2019-08-02T19:24:49.765Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-08-02T19:24:50.387Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-08-02T19:24:51.120Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-08-02T19:24:51.159Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteUserScoreSums/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-08-02T19:24:51.196Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-08-02T19:24:51.242Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-08-02T19:24:51.364Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-08-02T19:24:51.428Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-08-02T19:24:51.468Z: JOB_MESSAGE_DETAILED: Fusing consumer UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Reify into UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey+UserScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Partial
root: INFO: 2019-08-02T19:24:51.519Z: JOB_MESSAGE_DETAILED: Fusing consumer UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Write into UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Reify
root: INFO: 2019-08-02T19:24:51.558Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/Write/WriteImpl/GroupByKey/Write into WriteUserScoreSums/Write/WriteImpl/GroupByKey/Reify
root: INFO: 2019-08-02T19:24:51.606Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/Write/WriteImpl/GroupByKey/Reify into WriteUserScoreSums/Write/WriteImpl/WindowInto(WindowIntoFn)
root: INFO: 2019-08-02T19:24:51.647Z: JOB_MESSAGE_DETAILED: Fusing consumer UserScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Extract into UserScore/ExtractAndSumScore/CombinePerKey(sum)/Combine
root: INFO: 2019-08-02T19:24:51.684Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/Write/WriteImpl/Extract into WriteUserScoreSums/Write/WriteImpl/GroupByKey/GroupByWindow
root: INFO: 2019-08-02T19:24:51.720Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/Write/WriteImpl/Pair into WriteUserScoreSums/Write/WriteImpl/WriteBundles/WriteBundles
root: INFO: 2019-08-02T19:24:51.752Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/Write/WriteImpl/WindowInto(WindowIntoFn) into WriteUserScoreSums/Write/WriteImpl/Pair
root: INFO: 2019-08-02T19:24:51.785Z: JOB_MESSAGE_DETAILED: Fusing consumer UserScore/ParseGameEventFn into ReadInputText/Read
root: INFO: 2019-08-02T19:24:51.826Z: JOB_MESSAGE_DETAILED: Fusing consumer UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey+UserScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Partial into UserScore/ExtractAndSumScore/Map(<lambda at user_score.py:111>)
root: INFO: 2019-08-02T19:24:51.866Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/Write/WriteImpl/WriteBundles/WriteBundles into FormatUserScoreSums
root: INFO: 2019-08-02T19:24:51.914Z: JOB_MESSAGE_DETAILED: Fusing consumer FormatUserScoreSums into UserScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Extract
root: INFO: 2019-08-02T19:24:51.948Z: JOB_MESSAGE_DETAILED: Fusing consumer UserScore/ExtractAndSumScore/Map(<lambda at user_score.py:111>) into UserScore/ParseGameEventFn
root: INFO: 2019-08-02T19:24:51.987Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/Write/WriteImpl/GroupByKey/GroupByWindow into WriteUserScoreSums/Write/WriteImpl/GroupByKey/Read
root: INFO: 2019-08-02T19:24:52.019Z: JOB_MESSAGE_DETAILED: Fusing consumer UserScore/ExtractAndSumScore/CombinePerKey(sum)/Combine into UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Read
root: INFO: 2019-08-02T19:24:52.066Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/Write/WriteImpl/InitializeWrite into WriteUserScoreSums/Write/WriteImpl/DoOnce/Read
root: INFO: 2019-08-02T19:24:52.114Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-08-02T19:24:52.150Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-08-02T19:24:52.189Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-08-02T19:24:52.224Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-08-02T19:24:52.400Z: JOB_MESSAGE_DEBUG: Executing wait step start35
root: INFO: 2019-08-02T19:24:52.471Z: JOB_MESSAGE_BASIC: Executing operation WriteUserScoreSums/Write/WriteImpl/DoOnce/Read+WriteUserScoreSums/Write/WriteImpl/InitializeWrite
root: INFO: 2019-08-02T19:24:52.508Z: JOB_MESSAGE_BASIC: Executing operation WriteUserScoreSums/Write/WriteImpl/GroupByKey/Create
root: INFO: 2019-08-02T19:24:52.520Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-08-02T19:24:52.547Z: JOB_MESSAGE_BASIC: Executing operation UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Create
root: INFO: 2019-08-02T19:24:52.556Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-08-02T19:24:52.625Z: JOB_MESSAGE_BASIC: Finished operation UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Create
root: INFO: 2019-08-02T19:24:52.625Z: JOB_MESSAGE_BASIC: Finished operation WriteUserScoreSums/Write/WriteImpl/GroupByKey/Create
root: INFO: 2019-08-02T19:24:52.709Z: JOB_MESSAGE_DEBUG: Value "WriteUserScoreSums/Write/WriteImpl/GroupByKey/Session" materialized.
root: INFO: 2019-08-02T19:24:52.747Z: JOB_MESSAGE_DEBUG: Value "UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Session" materialized.
root: INFO: 2019-08-02T19:24:52.826Z: JOB_MESSAGE_BASIC: Executing operation ReadInputText/Read+UserScore/ParseGameEventFn+UserScore/ExtractAndSumScore/Map(<lambda at user_score.py:111>)+UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey+UserScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Partial+UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Reify+UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Write
root: INFO: 2019-08-02T19:24:52.960Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-08-02T19:25:03.540Z: JOB_MESSAGE_ERROR: Workflow failed.
root: INFO: 2019-08-02T19:25:03.658Z: JOB_MESSAGE_BASIC: Finished operation ReadInputText/Read+UserScore/ParseGameEventFn+UserScore/ExtractAndSumScore/Map(<lambda at user_score.py:111>)+UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey+UserScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Partial+UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Reify+UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Write
root: INFO: 2019-08-02T19:25:03.925Z: JOB_MESSAGE_WARNING: S01:WriteUserScoreSums/Write/WriteImpl/DoOnce/Read+WriteUserScoreSums/Write/WriteImpl/InitializeWrite failed.
root: INFO: 2019-08-02T19:25:03.968Z: JOB_MESSAGE_BASIC: Finished operation WriteUserScoreSums/Write/WriteImpl/DoOnce/Read+WriteUserScoreSums/Write/WriteImpl/InitializeWrite
root: INFO: 2019-08-02T19:25:04.101Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-08-02T19:25:04.170Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-08-02T19:25:04.206Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-08-02T19:26:50.784Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-08-02T19:30:14.794Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-08-02T19:30:14.839Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-08-02_12_24_45-7982724228651244612 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/5f7249a6-e02c-4cb4-ac6d-4b2fa262ce6e/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/5f7249a6-e02c-4cb4-ac6d-4b2fa262ce6e/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/5f7249a6\\-e02c\\-4cb4\\-ac6d\\-4b2fa262ce6e\\/results[^/\\\\]*'
root: INFO: Starting the size estimation of the input
root: INFO: Finished listing 0 files in 0.06080484390258789 seconds.
--------------------- >> end captured logging << ---------------------
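(Aside, not part of the original log: the `translate_pattern` DEBUG line in the captured logging above shows Beam turning the glob `results*` into a regex in which `*` matches any run of characters except a path separator. The sketch below checks that behavior with the plain-backslash form of the regex from the log's repr; the `-00000-of-00001` shard suffix is a hypothetical example of Beam's usual output naming, not taken from this log.)

```python
import re

# Regex from the translate_pattern DEBUG line, with repr escaping undone.
# Note '*' became '[^/\\]*': it will not cross a '/' or '\' separator.
regex = (r'gs\:\/\/temp\-storage\-for\-end\-to\-end\-tests'
         r'\/py\-it\-cloud\/output'
         r'\/5f7249a6\-e02c\-4cb4\-ac6d\-4b2fa262ce6e\/results[^/\\]*')

base = ('gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/'
        '5f7249a6-e02c-4cb4-ac6d-4b2fa262ce6e/results')

# A shard file directly under the prefix matches...
assert re.fullmatch(regex, base + '-00000-of-00001') is not None
# ...but the pattern does not descend into subdirectories.
assert re.fullmatch(regex, base + '/nested') is None
```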
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_24_48-8726111018304343707?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_40_50-18102015179117834663?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_50_17-8893602741757768147?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:642: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_24_46-3115112618358541511?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_50_06-16541761071793416336?project=apache-beam-testing.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_24_49-195101772682813395?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_38_17-1638860245687987465?project=apache-beam-testing.
  experiments = p.options.view_as(DebugOptions).experiments or []
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_48_29-10966165167969160393?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_56_56-10460426905576636202?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_13_06_03-17517789869095896250?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_24_45-7982724228651244612?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:572: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_30_46-16163034778385617103?project=apache-beam-testing.
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_40_14-8185154583247494353?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_50_05-2025102204053399856?project=apache-beam-testing.
  experiments = p.options.view_as(DebugOptions).experiments or []
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_57_49-14966576497258913952?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_24_45-10190744212383558113?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_34_02-18035267519918519141?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_42_49-9735299502590659736?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_51_17-3641950827683104663?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_59_13-9538080196540156782?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_24_44-5923051421105444017?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_33_37-15140951036839525520?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_42_54-10740196834321560088?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_52_45-12794807440488503659?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_13_02_07-225338102786205436?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_13_11_30-16844817524041502241?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_24_49-15317767566270370730?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_35_06-17823832928701272804?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_44_47-16003214518759746858?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_54_58-15354165176468446191?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_13_04_15-11914367959836175253?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_24_46-10600432298727671927?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_34_22-1079476085399932333?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_46_09-16685003522611600309?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_12_57_13-15356632739987484018?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:642: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3388.343s

FAILED (SKIP=5, errors=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 49

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 57m 24s
63 actionable tasks: 46 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/xxhhelbp4c4gc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python35 #126

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python35/126/display/redirect?page=changes>



Build failed in Jenkins: beam_PostCommit_Python35 #125

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python35/125/display/redirect?page=changes>

Changes:

[github] [BEAM-7060] Introduce Python3-only test modules (#9223)

------------------------------------------
[...truncated 78.85 KB...]
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:642: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... FAIL
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test doesn't work on DirectRunner.
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok

======================================================================
FAIL: test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_it_test.py>", line 163, in test_big_query_standard_sql
    big_query_query_to_table_pipeline.run_bq_pipeline(options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>", line 82, in run_bq_pipeline
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 107, in run
    else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/pipeline.py>", line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/pipeline.py>", line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/direct/test_direct_runner.py>", line 51, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
AssertionError: 
Expected: (Test pipeline expected terminated in state: DONE and Expected checksum is 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72)
     but: Expected checksum is 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72 Actual checksum is da39a3ee5e6b4b0d3255bfef95601890afd80709
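(Aside, not part of the original log: the "Actual checksum" `da39a3ee...` in the assertion above is the SHA-1 digest of empty input, which typically means the matcher found no data in the verified output. A quick check:)

```python
import hashlib

# SHA-1 of zero bytes -- the well-known "empty" digest.
empty_sha1 = hashlib.sha1(b"").hexdigest()
assert empty_sha1 == "da39a3ee5e6b4b0d3255bfef95601890afd80709"
```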

-------------------- >> begin captured logging << --------------------
root: INFO: Running pipeline with DirectRunner.
root: DEBUG: Query SELECT * FROM (SELECT "apple" as fruit) UNION ALL (SELECT "orange" as fruit) does not reference any tables.
root: WARNING: Dataset apache-beam-testing:temp_dataset_c61e7be7b62b46c897cafb283de42db2 does not exist so we will create it as temporary with location=None
root: DEBUG: Creating or getting table <TableReference
 datasetId: 'python_query_to_table_15647872064729'
 projectId: 'apache-beam-testing'
 tableId: 'output_table'> with schema {'fields': [{'type': 'STRING', 'mode': 'NULLABLE', 'name': 'fruit'}]}.
root: DEBUG: Created the table with id output_table
root: INFO: Created table apache-beam-testing.python_query_to_table_15647872064729.output_table with schema <TableSchema
 fields: [<TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'fruit'
 type: 'STRING'>]>. Result: <Table
 creationTime: 1564787209767
 etag: 'uHwWWZOtKI3oLzZ6/7O8AQ=='
 id: 'apache-beam-testing:python_query_to_table_15647872064729.output_table'
 kind: 'bigquery#table'
 lastModifiedTime: 1564787209825
 location: 'US'
 numBytes: 0
 numLongTermBytes: 0
 numRows: 0
 schema: <TableSchema
 fields: [<TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'fruit'
 type: 'STRING'>]>
 selfLink: 'https://www.googleapis.com/bigquery/v2/projects/apache-beam-testing/datasets/python_query_to_table_15647872064729/tables/output_table'
 tableReference: <TableReference
 datasetId: 'python_query_to_table_15647872064729'
 projectId: 'apache-beam-testing'
 tableId: 'output_table'>
 type: 'TABLE'>.
root: DEBUG: Attempting to flush to all destinations. Total buffered: 2
root: DEBUG: Flushing data to apache-beam-testing:python_query_to_table_15647872064729.output_table. Total 2 rows.
root: DEBUG: Passed: True. Errors are []
root: INFO: Attempting to perform query SELECT fruit from `python_query_to_table_15647872064729.output_table`; to BQ
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/115895df-98fd-4623-a038-47397bc0b017?timeoutMs=10000&maxResults=0&location=US HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/jobs/115895df-98fd-4623-a038-47397bc0b017?location=US HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon1a722a2cbf83b9e3e86059bca69e9afb580c338a/data HTTP/1.1" 200 None
root: INFO: Read from given query (SELECT fruit from `python_query_to_table_15647872064729.output_table`;), total rows 0
root: INFO: Generate checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 24.139s

FAILED (SKIP=1, failures=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_07_07-15370111117712695790?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_21_37-14019010911763719831?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_29_59-15525752181587911911?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_38_30-14353565421399896317?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_07_04-17638081206437203240?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_29_37-3573042638991007134?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_38_19-16480770366373842916?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_48_53-15227158564691285868?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_07_05-4073225700663512003?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_21_07-162694094169933977?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_30_22-10706348634963185461?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_38_11-6719599481631073043?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_07_04-12960409954343548701?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_27_12-5392296822215071926?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_36_27-3898667995004307408?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:642: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_07_04-5028403722067603418?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_16_20-8322977194148424366?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_24_33-11292925357405102241?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_33_06-13425432054831621082?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_41_14-17138749387695268821?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_07_03-2476192701789231754?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_15_46-11820282703218956114?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_25_59-2877201447688097398?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_34_53-2019207676708987209?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:572: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_07_06-18321454776120150679?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_16_43-8588040401896377457?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_28_35-8882688478533717545?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_37_14-12815007848249463573?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:642: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_45_00-266529593134387151?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_07_05-2327154298920257451?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_16_07-1042411392168451260?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_24_29-7713023118742437948?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_33_40-7972382324930287361?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_42_13-14806622676789338609?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-02_16_50_29-2226133842359822142?project=apache-beam-testing.
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3216.478s

OK (SKIP=5)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/direct/py35/build.gradle'> line: 49

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 34s
63 actionable tasks: 46 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/fkamlzgeh22bg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org