Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/06/22 00:14:08 UTC

Build failed in Jenkins: beam_PostCommit_Python3_Verify #1201

See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1201/display/redirect?page=changes>

Changes:

[dcavazos] Add Python snippet for FlatMap transform
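
For context, the change above adds a documentation snippet for Beam's FlatMap transform. A minimal illustrative sketch of FlatMap usage in the Python SDK (not the committed snippet itself) looks like this:

    import apache_beam as beam

    with beam.Pipeline() as pipeline:
        (pipeline
         | 'CreateLines' >> beam.Create(['to be', 'or not', 'to be'])
         # FlatMap emits zero or more output elements for each input element.
         | 'SplitWords' >> beam.FlatMap(lambda line: line.split())
         | 'PrintWords' >> beam.Map(print))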

------------------------------------------
[...truncated 500.26 KB...]
root: INFO: 2019-06-21T23:26:11.580Z: JOB_MESSAGE_DETAILED: Fusing consumer UserScore/ParseGameEventFn into ReadInputText/Read
root: INFO: 2019-06-21T23:26:11.626Z: JOB_MESSAGE_DETAILED: Fusing consumer UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey+UserScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Partial into UserScore/ExtractAndSumScore/Map(<lambda at user_score.py:111>)
root: INFO: 2019-06-21T23:26:11.673Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/Write/WriteImpl/WriteBundles/WriteBundles into FormatUserScoreSums
root: INFO: 2019-06-21T23:26:11.724Z: JOB_MESSAGE_DETAILED: Fusing consumer FormatUserScoreSums into UserScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Extract
root: INFO: 2019-06-21T23:26:11.767Z: JOB_MESSAGE_DETAILED: Fusing consumer UserScore/ExtractAndSumScore/Map(<lambda at user_score.py:111>) into UserScore/ParseGameEventFn
root: INFO: 2019-06-21T23:26:11.796Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/Write/WriteImpl/GroupByKey/GroupByWindow into WriteUserScoreSums/Write/WriteImpl/GroupByKey/Read
root: INFO: 2019-06-21T23:26:11.842Z: JOB_MESSAGE_DETAILED: Fusing consumer UserScore/ExtractAndSumScore/CombinePerKey(sum)/Combine into UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Read
root: INFO: 2019-06-21T23:26:11.885Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/Write/WriteImpl/InitializeWrite into WriteUserScoreSums/Write/WriteImpl/DoOnce/Read
root: INFO: 2019-06-21T23:26:11.928Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-06-21T23:26:11.980Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-06-21T23:26:12.033Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-06-21T23:26:12.079Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-06-21T23:26:12.264Z: JOB_MESSAGE_DEBUG: Executing wait step start35
root: INFO: 2019-06-21T23:26:12.351Z: JOB_MESSAGE_BASIC: Executing operation WriteUserScoreSums/Write/WriteImpl/DoOnce/Read+WriteUserScoreSums/Write/WriteImpl/InitializeWrite
root: INFO: 2019-06-21T23:26:12.390Z: JOB_MESSAGE_BASIC: Executing operation WriteUserScoreSums/Write/WriteImpl/GroupByKey/Create
root: INFO: 2019-06-21T23:26:12.402Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-06-21T23:26:12.434Z: JOB_MESSAGE_BASIC: Executing operation UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Create
root: INFO: 2019-06-21T23:26:12.455Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: 2019-06-21T23:26:12.514Z: JOB_MESSAGE_BASIC: Finished operation WriteUserScoreSums/Write/WriteImpl/GroupByKey/Create
root: INFO: 2019-06-21T23:26:12.514Z: JOB_MESSAGE_BASIC: Finished operation UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Create
root: INFO: 2019-06-21T23:26:12.601Z: JOB_MESSAGE_DEBUG: Value "UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Session" materialized.
root: INFO: 2019-06-21T23:26:12.640Z: JOB_MESSAGE_DEBUG: Value "WriteUserScoreSums/Write/WriteImpl/GroupByKey/Session" materialized.
root: INFO: 2019-06-21T23:26:12.696Z: JOB_MESSAGE_BASIC: Executing operation ReadInputText/Read+UserScore/ParseGameEventFn+UserScore/ExtractAndSumScore/Map(<lambda at user_score.py:111>)+UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey+UserScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Partial+UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Reify+UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Write
root: INFO: 2019-06-21T23:26:36.351Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 1 to 5.
root: INFO: 2019-06-21T23:27:10.304Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-06-21T23:27:10.331Z: JOB_MESSAGE_DETAILED: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
root: INFO: 2019-06-21T23:27:42.589Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running step(s).
root: INFO: 2019-06-21T23:27:42.633Z: JOB_MESSAGE_DETAILED: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
root: INFO: 2019-06-21T23:27:47.908Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running step(s).
root: INFO: 2019-06-21T23:28:04.631Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-06-21T23:28:04.661Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-06-21T23:31:47.354Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 5 to 11.
root: INFO: 2019-06-21T23:32:20.372Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 11 to 14.
root: INFO: 2019-06-21T23:32:26.036Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 7 based on the rate of progress in the currently running step(s).
root: INFO: 2019-06-21T23:32:26.080Z: JOB_MESSAGE_DETAILED: Resized worker pool to 7, though goal was 14.  This could be a quota issue.
root: INFO: 2019-06-21T23:32:31.415Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 8 based on the rate of progress in the currently running step(s).
root: INFO: 2019-06-21T23:32:31.462Z: JOB_MESSAGE_DETAILED: Resized worker pool to 8, though goal was 14.  This could be a quota issue.
root: INFO: 2019-06-21T23:32:49.804Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 14 to 17.
root: INFO: 2019-06-21T23:32:50.187Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 9 based on the rate of progress in the currently running step(s).
root: INFO: 2019-06-21T23:32:50.216Z: JOB_MESSAGE_DETAILED: Resized worker pool to 9, though goal was 17.  This could be a quota issue.
root: INFO: 2019-06-21T23:33:06.072Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 10 based on the rate of progress in the currently running step(s).
root: INFO: 2019-06-21T23:33:06.121Z: JOB_MESSAGE_DETAILED: Resized worker pool to 10, though goal was 17.  This could be a quota issue.
root: INFO: 2019-06-21T23:33:18.587Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 17 to 20.
root: INFO: 2019-06-21T23:33:18.972Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 12 based on the rate of progress in the currently running step(s).
root: INFO: 2019-06-21T23:33:19.021Z: JOB_MESSAGE_DETAILED: Resized worker pool to 12, though goal was 20.  This could be a quota issue.
root: INFO: 2019-06-21T23:33:24.402Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 13 based on the rate of progress in the currently running step(s).
root: INFO: 2019-06-21T23:33:24.452Z: JOB_MESSAGE_DETAILED: Resized worker pool to 13, though goal was 20.  This could be a quota issue.
root: INFO: 2019-06-21T23:34:07.190Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 14 based on the rate of progress in the currently running step(s).
root: INFO: 2019-06-21T23:34:07.239Z: JOB_MESSAGE_DETAILED: Resized worker pool to 14, though goal was 20.  This could be a quota issue.
root: INFO: 2019-06-21T23:34:23.244Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 15 based on the rate of progress in the currently running step(s).
root: INFO: 2019-06-21T23:34:23.293Z: JOB_MESSAGE_DETAILED: Resized worker pool to 15, though goal was 20.  This could be a quota issue.
root: INFO: 2019-06-21T23:34:44.767Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 17 based on the rate of progress in the currently running step(s).
root: INFO: 2019-06-21T23:34:44.813Z: JOB_MESSAGE_DETAILED: Resized worker pool to 17, though goal was 20.  This could be a quota issue.
root: INFO: 2019-06-21T23:35:06.288Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 18 based on the rate of progress in the currently running step(s).
root: INFO: 2019-06-21T23:35:06.315Z: JOB_MESSAGE_DETAILED: Resized worker pool to 18, though goal was 20.  This could be a quota issue.
root: INFO: 2019-06-21T23:36:15.237Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 19 based on the rate of progress in the currently running step(s).
root: INFO: 2019-06-21T23:36:15.269Z: JOB_MESSAGE_DETAILED: Resized worker pool to 19, though goal was 20.  This could be a quota issue.
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/8cb442ac-0733-476c-8671-f06bed0fd16d/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/8cb442ac-0733-476c-8671-f06bed0fd16d/results*' -> 'gs://temp\\-storage\\-for\\-end\\-to\\-end\\-tests/py\\-it\\-cloud/output/8cb442ac\\-0733\\-476c\\-8671\\-f06bed0fd16d/results[^/\\\\]*'
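
The translate_pattern DEBUG line above shows the filesystem layer turning a glob into a regular expression. As an illustrative sketch (not Beam's actual implementation), a glob-to-regex translation that escapes literal characters and lets '*' match anything except a path separator can be written as:

    import re

    def glob_to_regex(pattern):
        # Escape regex metacharacters, then let '*' match anything
        # except a path separator, mirroring the DEBUG output above.
        escaped = re.escape(pattern)
        return escaped.replace(r'\*', r'[^/\\]*')

    print(glob_to_regex('gs://my-bucket/output/results*'))
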
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.0600285530090332 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_26_11-18393231296655004464?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_41_57-13936456182347081822?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_49_51-4811510140510522322?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
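
The BeamDeprecationWarning about <pipeline>.options above refers to reading options back off an already-constructed pipeline. A hedged sketch of the non-deprecated pattern, where the options object is built first and then handed to the Pipeline (bucket name illustrative):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    # Build the options up front and keep a reference to them,
    # instead of reading them back via pipeline.options.
    my_options = PipelineOptions(['--temp_location', 'gs://my-bucket/temp'])
    temp_location = my_options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=my_options) as p:
        p | 'Create' >> beam.Create([1, 2, 3]) | 'Print' >> beam.Map(print)
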
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_26_05-14950791085179510501?project=apache-beam-testing.
  kms_key=transform.kms_key))
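
The warning above recommends WriteToBigQuery over the deprecated BigQuerySink. A minimal sketch of the suggested replacement, assuming a hypothetical table and a simple schema string:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | 'CreateRows' >> beam.Create([{'name': 'beam', 'score': 1}])
         # WriteToBigQuery is the recommended replacement for BigQuerySink.
         | 'WriteRows' >> beam.io.WriteToBigQuery(
             'my-project:my_dataset.my_table',  # hypothetical table spec
             schema='name:STRING,score:INTEGER',
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))
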
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_48_49-7684473006651011518?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_56_39-4643788915543203063?project=apache-beam-testing.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
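
The MatchAll and ReadMatches FutureWarnings above come from the experimental fileio transforms exercised by fileio_test.py. A hedged sketch of how those transforms are typically chained (file pattern illustrative):

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        (p
         | 'CreatePatterns' >> beam.Create(['gs://my-bucket/results*'])
         | 'MatchAll' >> fileio.MatchAll()        # experimental, per the warning
         | 'ReadMatches' >> fileio.ReadMatches()  # experimental, per the warning
         | 'GetPath' >> beam.Map(lambda readable_file: readable_file.metadata.path)
         | 'Print' >> beam.Map(print))
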
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_26_09-11554588827908664826?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_39_57-4967641292677630236?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_48_30-13098536521300452068?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_55_52-2748572335995270842?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_26_05-16879858322100650749?project=apache-beam-testing.
Exception in thread Thread-1:
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_42_40-9263922053256811445?project=apache-beam-testing.
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_49_52-13543372502106209166?project=apache-beam-testing.
  File "/usr/lib/python3.7/threading.py", line 865, in run
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_58_20-4063051534236356371?project=apache-beam-testing.
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 190, in poll_for_job_completion
    job_id, page_token=page_token, start_time=last_message_time)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 748, in list_messages
    response = self._client.projects_locations_jobs_messages.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 553, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py",> line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-06-21_16_26_05-16879858322100650749/messages?alt=json&startTime=2019-06-21T23%3A36%3A15.269Z>: response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Fri, 21 Jun 2019 23:36:52 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '280', '-content-encoding': 'gzip'}>, content <{
  "error": {
    "code": 404,
    "message": "(447d5ecef82f6b93): Information about job 2019-06-21_16_26_05-16879858322100650749 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>

<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Exception in thread Thread-2:
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_26_06-14621740873260598766?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_34_07-2169025856194726849?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_39_59-5893037205926440655?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_48_51-9254568375266938633?project=apache-beam-testing.
Traceback (most recent call last):
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_55_47-7786098347803271628?project=apache-beam-testing.
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_03_17-4729612269441424673?project=apache-beam-testing.
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 157, in poll_for_job_completion
    response = runner.dataflow_client.get_job(job_id)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 663, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 689, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py",> line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-06-21_16_34_07-2169025856194726849?alt=json>: response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Fri, 21 Jun 2019 23:36:55 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '279', '-content-encoding': 'gzip'}>, content <{
  "error": {
    "code": 404,
    "message": "(169f62f5c600cf44): Information about job 2019-06-21_16_34_07-2169025856194726849 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>

Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_26_03-14477449320749703606?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_37_08-10948634396187328522?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_45_24-11953073300802833068?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_53_27-8225484149465518416?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_26_09-7607867728685285442?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_35_44-17655825578026409916?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_45_21-9681417173464862211?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_55_53-9032479083937632177?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_05_30-6198984688079407050?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_26_08-16641650046859705348?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_34_24-1950417740583095769?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_41_30-11352296029071918634?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_49_06-18194048942297944660?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_57_06-17731591888501809699?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_06_18-2423679799971849116?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 2899.926s

FAILED (SKIP=5, failures=2)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 78

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 49m 25s
77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/eklw3usumhw7c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python3_Verify #1204

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1204/display/redirect>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #1203

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1203/display/redirect>

------------------------------------------
[...truncated 955.08 KB...]
        "schema": "{\"fields\": [{\"name\": \"bytes\", \"type\": \"BYTES\", \"mode\": \"NULLABLE\"}, {\"name\": \"date\", \"type\": \"DATE\", \"mode\": \"NULLABLE\"}, {\"name\": \"time\", \"type\": \"TIME\", \"mode\": \"NULLABLE\"}]}",
        "table": "python_new_types_table",
        "user_name": "write/WriteToBigQuery/NativeWrite",
        "write_disposition": "WRITE_EMPTY"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
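
The schema string in the job payload above declares three NULLABLE fields (bytes, date, time). As a hedged sketch, assuming the standard Beam BigQuery tooling, the same schema is commonly expressed in Python in either of two equivalent forms:

    # 1. Compact string form accepted by WriteToBigQuery.
    schema_str = 'bytes:BYTES,date:DATE,time:TIME'

    # 2. Explicit TableSchema objects (all fields NULLABLE, as in the payload).
    from apache_beam.io.gcp.internal.clients import bigquery

    table_schema = bigquery.TableSchema()
    for name, field_type in [('bytes', 'BYTES'), ('date', 'DATE'), ('time', 'TIME')]:
        field = bigquery.TableFieldSchema()
        field.name, field.type, field.mode = name, field_type, 'NULLABLE'
        table_schema.fields.append(field)
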
root: INFO: Create job: <Job
 createTime: '2019-06-22T06:29:35.119094Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-06-21_23_29_33-537541541024632144'
 location: 'us-central1'
 name: 'beamapp-jenkins-0622062925-431468'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-06-22T06:29:35.119094Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-06-21_23_29_33-537541541024632144]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_29_33-537541541024632144?project=apache-beam-testing
root: INFO: Job 2019-06-21_23_29_33-537541541024632144 is in state JOB_STATE_RUNNING
root: INFO: 2019-06-22T06:29:34.024Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-06-21_23_29_33-537541541024632144. The number of workers will be between 1 and 1000.
root: INFO: 2019-06-22T06:29:34.097Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-06-21_23_29_33-537541541024632144.
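
The autoscaling messages above reflect the worker bounds attached to the job. A hedged sketch of how those bounds are commonly set from the Python SDK via pipeline options (project and values illustrative):

    from apache_beam.options.pipeline_options import PipelineOptions

    # num_workers sets the starting pool size; max_num_workers caps autoscaling.
    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-project',
        '--region=us-central1',
        '--num_workers=1',
        '--max_num_workers=20',
        '--autoscaling_algorithm=THROUGHPUT_BASED',
    ])
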
root: INFO: 2019-06-22T06:29:37.856Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-06-22T06:29:38.627Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-b.
root: INFO: 2019-06-22T06:29:39.158Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-06-22T06:29:39.199Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-06-22T06:29:39.242Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-06-22T06:29:39.270Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-06-22T06:29:39.323Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-06-22T06:29:39.361Z: JOB_MESSAGE_DETAILED: Fusing consumer write/WriteToBigQuery/NativeWrite into create/Read
root: INFO: 2019-06-22T06:29:39.402Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-06-22T06:29:39.440Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-06-22T06:29:39.485Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-06-22T06:29:39.531Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-06-22T06:29:39.694Z: JOB_MESSAGE_DEBUG: Executing wait step start3
root: INFO: 2019-06-22T06:29:39.784Z: JOB_MESSAGE_BASIC: Executing operation create/Read+write/WriteToBigQuery/NativeWrite
root: INFO: 2019-06-22T06:29:39.842Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-06-22T06:29:39.882Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
root: INFO: 2019-06-22T06:30:21.041Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-06-22T06:31:21.463Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-06-22T06:31:21.498Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: Deleting dataset python_write_to_table_15611849642063 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_01_43-958838904750536145?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_15_54-15284957889634206954?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_25_56-903233705683863832?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_32_36-13296206689689059799?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Exception in thread Thread-10:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 190, in poll_for_job_completion
    job_id, page_token=page_token, start_time=last_message_time)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 748, in list_messages
    response = self._client.projects_locations_jobs_messages.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 553, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py",> line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-06-21_23_32_36-13296206689689059799/messages?alt=json&startTime=2019-06-22T06%3A32%3A43.451Z>: response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Sat, 22 Jun 2019 06:33:48 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '280', '-content-encoding': 'gzip'}>, content <{
  "error": {
    "code": 404,
    "message": "(89454efd4ad918a7): Information about job 2019-06-21_23_32_36-13296206689689059799 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>

Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_01_38-7642746719099120410?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_27_06-17352115257136272866?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Exception in thread Thread-2:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 190, in poll_for_job_completion
    job_id, page_token=page_token, start_time=last_message_time)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 748, in list_messages
    response = self._client.projects_locations_jobs_messages.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 553, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py",> line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-06-21_23_27_06-17352115257136272866/messages?alt=json&startTime=2019-06-22T06%3A31%3A59.148Z>: response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Sat, 22 Jun 2019 06:33:55 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '280', '-content-encoding': 'gzip'}>, content <{
  "error": {
    "code": 404,
    "message": "(a0220307ec4af7a1): Information about job 2019-06-21_23_27_06-17352115257136272866 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>

Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_01_42-6726334147587420530?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_13_48-4225943333173977445?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_23_05-2331364864180035157?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_29_55-3612927103558663733?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_01_40-6415914852183294312?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Exception in thread Thread-3:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 190, in poll_for_job_completion
    job_id, page_token=page_token, start_time=last_message_time)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 197, in wrapper
    return fun(*args, **kwargs)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_22_01-7792398197643559341?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_29_33-537541541024632144?project=apache-beam-testing.
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 748, in list_messages
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_35_55-10009510785085056653?project=apache-beam-testing.
    response = self._client.projects_locations_jobs_messages.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 553, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py",> line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-06-21_23_29_33-537541541024632144/messages?alt=json&startTime=2019-06-22T06%3A31%3A21.498Z>: response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Sat, 22 Jun 2019 06:33:53 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '278', '-content-encoding': 'gzip'}>, content <{
  "error": {
    "code": 404,
    "message": "(23cfdf523a883149): Information about job 2019-06-21_23_29_33-537541541024632144 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>

Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_01_40-172530233014679673?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_10_54-14807073928729497957?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_19_32-14653207407488167561?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_26_03-12397743942477123964?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_32_29-16851655272406098810?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_01_38-2624258852873327602?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_09_40-7065524756107692256?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_17_21-10520368024660103877?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_24_05-8654615081181644003?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_32_33-9796238711184230398?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_40_09-16326395050374499356?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_01_43-16808977747846378110?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_11_05-15894841366928730958?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_19_42-12064131799556928302?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_27_25-15192967089958281676?project=apache-beam-testing.
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_01_41-15075028999863126919?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_10_47-401594111060610061?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_19_34-8997538364818162357?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_27_50-6475366265044489470?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_34_45-2138574932875876249?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_41_37-3097797368069096088?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 2871.490s

FAILED (SKIP=5, errors=1, failures=2)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 78

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 48m 53s
77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/vnpckvsz2nwno

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #1202

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1202/display/redirect>

------------------------------------------
[...truncated 222.26 KB...]
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_56_26-2873863551149204832?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_21_02-17439793784928754492?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_44_23-13458023172031356958?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_54_55-1285780886910475575?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_21_07-5315821532718324333?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_35_09-5628573328241290326?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_45_41-17863723015352506896?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_53_57-8604952701471884825?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_18_03_03-5977023531506651570?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_21_03-586605357563823467?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_43_30-6540448748118981798?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_53_53-18084323831255560053?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_18_05_33-5659692645502825724?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_21_02-7508083448591067930?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_31_58-5343837982701772214?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_40_11-10441967251393553915?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_47_32-7305987495166515583?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_54_54-3790624117135401059?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_21_01-4201400987135012588?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_29_34-11713131150765440587?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_40_06-12485531700292904137?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_48_11-16483021631244003973?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_21_04-13196582826271841173?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_33_34-5827223025852029854?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_44_06-18350277836579130569?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_52_00-2394741278191637602?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_21_03-13697384202377586820?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_29_45-11264833113978518928?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_37_12-9520270703016041411?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_45_26-12581927140713867958?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_55_42-16838959128824445964?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_18_02_59-13538149595064039756?project=apache-beam-testing.
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform_streaming (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... SKIP: TestStream is not supported on TestDataflowRunner
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3135.408s

OK (SKIP=5)
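
The BeamDeprecationWarning lines above recommend replacing the deprecated BigQuerySink with WriteToBigQuery. A minimal sketch of that migration follows; the project, dataset, table, and schema names are hypothetical and are not taken from these tests:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Write a tiny in-memory PCollection with the non-deprecated transform.
    with beam.Pipeline(options=PipelineOptions()) as p:
        (p
         | 'Create' >> beam.Create([{'word': 'beam', 'count': 1}])
         | 'WriteToBQ' >> beam.io.WriteToBigQuery(
             'my-project:my_dataset.my_table',  # hypothetical destination table
             schema='word:STRING,count:INTEGER',
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))

Unlike BigQuerySink (which was wrapped in beam.io.Write), WriteToBigQuery is applied directly to the PCollection, which is the forward-compatible path the warning points at.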

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_20_46-10278207287842843249?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_37_01-12944986506124211105?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_44_12-13488896633491024547?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_52_14-8305812890701121694?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_18_00_01-17357218686889367918?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_20_43-49819272187616549?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_44_00-15769394884827264516?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_52_06-7280948319201571431?project=apache-beam-testing.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_20_44-9114564072429784232?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_33_43-5795465076844284658?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_42_19-4165909481779984482?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_53_17-17277350326311761480?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_20_42-13447754892835012249?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_41_04-12110630087530864235?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_48_35-270469703172281225?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_56_33-14506760943633331783?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_20_42-10506249879788957002?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_32_25-17664152660915099016?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_44_21-6755299712814624032?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_52_19-12365294673367166542?project=apache-beam-testing.
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_20_40-8050509070165827870?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_29_25-11664101082233269398?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_37_12-1666274233515840830?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_45_08-13674860451337570533?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_20_45-13318168523943332381?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_30_14-7730031441804364956?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_37_32-15321333137295616133?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_45_08-5561800377101413235?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_53_07-3911261280275048121?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_20_43-6924450403773928367?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_29_44-16377308308093281457?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_39_26-6415096636000073005?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_50_06-8632044217197383384?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_58_57-9960158144320925854?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_18_06_43-6364819763039482378?project=apache-beam-testing.
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform_streaming (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... SKIP: TestStream is not supported on TestDataflowRunner
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3464.601s

OK (SKIP=5)
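
The FutureWarning lines above ("MatchAll is experimental", "ReadMatches is experimental") come from the fileio transforms exercised by fileio_test. A minimal sketch of that match-and-checksum pattern, assuming a hypothetical file pattern and hashing helper:

    import hashlib

    import apache_beam as beam
    from apache_beam.io import fileio

    def compute_hash(readable_file):
        # Hash the whole file contents and keep the path so results are identifiable.
        return readable_file.metadata.path, hashlib.sha1(readable_file.read()).hexdigest()

    with beam.Pipeline() as p:
        checksums = (p
                     | 'Patterns' >> beam.Create(['gs://my-bucket/input/*.txt'])  # hypothetical pattern
                     | 'Match' >> fileio.MatchAll()      # experimental: emits FileMetadata
                     | 'Read' >> fileio.ReadMatches()    # experimental: emits ReadableFile
                     | 'Checksums' >> beam.Map(compute_hash))

Both transforms were still marked experimental in this SDK version, so these warnings are expected noise rather than a cause of the failure.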

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
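
For anyone reproducing this failure locally, the Gradle hints above can be combined into a single re-run of the failing task (an illustrative command only, not part of the Jenkins job definition):

    ./gradlew :sdks:python:test-suites:dataflow:py36:postCommitIT --stacktrace --info --warning-mode all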

BUILD FAILED in 58m 44s
77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/lrom6ahto2bk4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org