Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/12/06 13:01:14 UTC

Build failed in Jenkins: beam_PostCommit_Python36 #1158

See <https://builds.apache.org/job/beam_PostCommit_Python36/1158/display/redirect>

Changes:


------------------------------------------
[...truncated 531.57 KB...]
19/12/06 12:09:52 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/06 12:10:14 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1206120948-368642e7_0425292f-91f7-4d09-9dc6-bfae5dd1bbc4: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.10 seconds.
19/12/06 12:10:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1206120948-368642e7_0425292f-91f7-4d09-9dc6-bfae5dd1bbc4 finished.
19/12/06 12:10:36 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 12:10:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-tempe9as00bw/artifactsc7s2qz_g/job_e4b43b78-839c-4d19-aac4-a862def33276/MANIFEST has 1 artifact locations
19/12/06 12:10:36 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-tempe9as00bw/artifactsc7s2qz_g/job_e4b43b78-839c-4d19-aac4-a862def33276/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/06 12:10:36 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1206120948-368642e7_0425292f-91f7-4d09-9dc6-bfae5dd1bbc4
19/12/06 12:10:36 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1206120948-368642e7_0425292f-91f7-4d09-9dc6-bfae5dd1bbc4
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 530, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 392, in __next__
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1575634237.151423940","description":"Error received from peer ipv4:127.0.0.1:42677","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 111, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 392, in __next__
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1575634237.151457030","description":"Error received from peer ipv4:127.0.0.1:36931","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 392, in __next__
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1575634237.151368091","description":"Error received from peer ipv4:127.0.0.1:39069","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 286, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 392, in __next__
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1575634237.151368091","description":"Error received from peer ipv4:127.0.0.1:39069","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
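
The three tracebacks above come from the Python SDK worker's gRPC streams (state, control, and data plane) being torn down after the Spark portable job completed; StatusCode.UNAVAILABLE with "Socket closed" is what grpc raises on the client side when the peer closes the channel. A minimal sketch, not Beam code, of how such a shutdown error can be tolerated while draining a gRPC streaming response iterator (drain and handle are hypothetical names):

import grpc

def drain(response_iterator, handle):
    """Consume a gRPC streaming response, ignoring the error raised when the
    server closes the channel during shutdown."""
    try:
        for response in response_iterator:
            handle(response)
    except grpc.RpcError as err:
        # UNAVAILABLE / "Socket closed" is expected once the peer has exited.
        if err.code() != grpc.StatusCode.UNAVAILABLE:
            raise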


> Task :sdks:python:test-suites:portable:py36:postCommitPy36

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_05_37-15899098130655853204?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
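
The BeamDeprecationWarning above (and its many repeats below) is raised because SDK code reads options back off the pipeline object via p.options. A minimal sketch of the pattern the warning steers away from, assuming user code that keeps a reference to the PipelineOptions it constructed (the experiment flag is only an example):

import apache_beam as beam
from apache_beam.options.pipeline_options import DebugOptions, PipelineOptions

options = PipelineOptions(['--experiments=use_beam_bq_sink'])  # example flag only

with beam.Pipeline(options=options) as p:
    # Deprecated: reading options back off the pipeline object.
    #   experiments = p.options.view_as(DebugOptions).experiments or []
    # Preferred: consult the options object the pipeline was built with.
    experiments = options.view_as(DebugOptions).experiments or []
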
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_20_23-15482351348826284133?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_28_24-10248821030574622054?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_36_59-18340649436565502043?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_05_33-9182701165251371187?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:731: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
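
Every BigQuerySink deprecation warning in this run points at the same migration: write through beam.io.WriteToBigQuery rather than the legacy sink. A minimal sketch with placeholder table and schema values (this is not the code emitting the warning):

import apache_beam as beam

with beam.Pipeline() as p:
    rows = p | beam.Create([{'fruit': 'apple'}, {'fruit': 'orange'}])
    # Deprecated since 2.11.0:
    #   rows | beam.io.Write(beam.io.BigQuerySink('my_dataset.my_table', ...))
    # Recommended replacement:
    rows | beam.io.WriteToBigQuery(
        'my_dataset.my_table',        # placeholder table spec
        schema='fruit:STRING',
        create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
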
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_29_37-13169601170563382216?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:731: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_38_14-7071865194924464011?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_46_55-792218972133728643?project=apache-beam-testing
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:731: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:731: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_05_35-17926809282025390562?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_19_33-11584884676326014535?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_29_08-4342487295447689705?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_47_12-11238498162418077475?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_05_35-9194383026492960552?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_15_58-6081587234142877103?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_26_21-16597816042563819589?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_35_04-1875833888520339855?project=apache-beam-testing
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_43_56-18386918037026724196?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_05_34-8934262162977831226?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_28_17-15382120587551147365?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_37_45-15631907418195613142?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_46_59-14919842693617247518?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_05_33-11166298616517213783?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:731: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_14_55-10992392859686628525?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_25_18-12702309020906489594?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_35_20-1315091770352551942?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_45_58-18098573892323984178?project=apache-beam-testing
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
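
The FutureWarning lines above come from fileio_test exercising the experimental file-matching transforms. A minimal sketch of that transform chain with a placeholder glob (not the test's actual inputs or checksum logic):

import apache_beam as beam
from apache_beam.io import fileio

with beam.Pipeline() as p:
    contents = (
        p
        | 'Patterns' >> beam.Create(['/tmp/example-*.txt'])   # placeholder glob
        | 'MatchAll' >> fileio.MatchAll()                      # experimental, per the warning
        | 'ReadMatches' >> fileio.ReadMatches()                # experimental, per the warning
        | 'Contents' >> beam.Map(lambda readable: readable.read_utf8()))
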
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_05_35-956403600820443416?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_15_32-9055589577460766320?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_24_12-8367519382079240363?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_33_02-2409639131966728832?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_42_07-4810080416309249203?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_51_28-9668081554135718387?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_05_34-9316743498171394662?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_15_41-9513802017430377664?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:731: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_24_33-11808843795680046555?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:731: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_34_05-4962235548952047139?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:731: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_04_43_26-17742957817254098029?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3356.963s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/direct/py36/build.gradle>' line: 51

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 13s
83 actionable tasks: 62 executed, 21 from cache

Publishing build scan...
https://scans.gradle.com/s/dgehnxz4tevfm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python36 #1164

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python36/1164/display/redirect>




Build failed in Jenkins: beam_PostCommit_Python36 #1163

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python36/1163/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-8835] Stage artifacts to BEAM-PIPELINE dir in zip

[kcweaver] [BEAM-8835] Check for leading slash in zip file paths.


------------------------------------------
[...truncated 604.41 KB...]
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "_equal"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_1"
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_1"
                    }
                  ],
                  "is_pair_like": true,
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_1"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert_that/Match.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s10"
        },
        "serialized_fn": "eNq1VetyFEUUnpndXBggSKIQxEtA0YnCjoIiIuG2SQAXljhEdlDD2DPTuzPZuZ2eHpJU7VYhuDEvoWVZZZW/rbLKP5SvEN5B38PTvRtiRPhnTe1M93fO+U73OV/33i8ZHsmIF1DHpSSucEaSvJmyOK94KaN6lUQRcSPaYCTLKJtN5xMdlOlvQO2CZtglRVGcZgIlzw+jqOKIt+54jBJOnWaReDxMMaBs7LBHKfEdvpZRHYbsUaSopj5dxDkM92DEglGjptYU/Gm1iaq+oXSUh2pLvaXArnoP9GlbxZBV2N2DPXaMQzNIY2ou06QdJvnW90QekXvUXElZO8f9UVNsz1lIc15N4zjkzsIaD9Lk1GkzZ56Z++3czCRi/qMc5nY5TFGOSrYGe+WKz0Ukdn1yHsZufDdUVWCfrSGKlXihB/unOYxbMLFjzy3KHcI50+FFSeAWYcRxnfCSPYJTNAsrHFiHgxZM7ggN4yxl3IlTv4iwZIfswxjwnKbByz04bMErMo+DJB53HHh1HV6z4HV7WIAUChLBVP2/2uZRnMCRoGwEg0YM1caxEb9wZQMb0VXXjnNVjjSuitZ0Sx2tU2qX2Emu+aoYN7UDiD9QG0qSlBWucUTaOvtJVcSzerGjzCpLZ7rltbGOulzulB+qqtJQoI6+Q+j3bd9PsInmb7HdQQ8bfw2lo7IfnrYmmq34CsrkaN0+iLucJ2FE/SmS55Txs1PH2NTMDL7hjXV407DL6BGFOYdjsiQ5lpj68JY9gZPLWNVLMmxu1aOZEDG8be9Ci1DpHGMpA0OGMRqn9yhM2zpObpOoGFjf4fBu34N4XNT6uL0XJ3Q1ox7mcWTmE/a+J5mdLRNUpOcAHUSbUiQ0ojFNOLzH4X17+X9WPs1Rni2z4GEkZH8ymKp9Xz2qqKPDmjoqH00dHxlXdfyO4aOpk2oZ33BK6u7Jdj7owYd4IE5b8FEwGRyyJ/8t3n6iikgEZ3rwsQVnAxTrJxacC6bqwZElmDG6cL5fzaSIXcrgQk2Vd0+OR+aiHBHXg0tFFy4bwYWaFvRBnzahiuCsAEsICoo/H/3418+PYA7xeYGXEd+D+OYfj+9v/r756+Zvjx/AlYL24KoF13rwaRdq9m5xasSV5QRhwnO4vvPWRIPEKz7FE0h4ynL92k0hlqsC1uEGXpn1ehduGpIqTLKCS74cFuoyfVrwbeyzerEOlpvXOdyyYLEHn1twuweNLthGcD0QbHeQ7QsjWKgH0vlLt79Gwlo5Vl5cyl8FiwUSLFlwV2bACxkv874dnLqUVMZSj+Y5fB3cfWqPROZxMY+3ncd3C3cJaBeaS9B67l9HI0z8dAVbq0OAPGEXlg0pjRVpwAW2nxXf99CvRKlLoj4P1jBCllieQs7CVosypEieRTFw0WdpkxQRXxxMIUWSzN4vD6ZXxEVExOkWlysFQFWNC/owRlGSOHO8NHbDBAXH0CTrE+aO36eEfKNwOfDK3w4vUBw=",
        "user_name": "assert_that/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
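
The "assert_that/Match" step in the job graph above is the tail of Beam's testing assertion pattern; the fusion messages further down show it split into WindowInto, ToVoidKey, Group, Unkey and Match stages. A rough, hypothetical sketch of that pattern (the real test asserts on rows read from BigQuery; a Create stands in for that read here):

import apache_beam as beam
from apache_beam.testing.util import assert_that, equal_to

with beam.Pipeline() as p:
    rows = p | 'read' >> beam.Create([{'fruit': 'apple'}, {'fruit': 'orange'}])
    # assert_that expands into the assert_that/* steps visible in the graph above.
    assert_that(rows, equal_to([{'fruit': 'apple'}, {'fruit': 'orange'}]))
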
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: '2019-12-06T23:57:16.663848Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-12-06_15_57_15-2021396393477931944'
 location: 'us-central1'
 name: 'beamapp-jenkins-1206235704-475379'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-12-06T23:57:16.663848Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2019-12-06_15_57_15-2021396393477931944]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_15_57_15-2021396393477931944?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2019-12-06_15_57_15-2021396393477931944 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:15.469Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-12-06_15_57_15-2021396393477931944.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:15.469Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-12-06_15_57_15-2021396393477931944. The number of workers will be between 1 and 1000.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:18.898Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:20.680Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:21.311Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:21.333Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:21.360Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:21.429Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:21.532Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:21.788Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:21.810Z: JOB_MESSAGE_DETAILED: Unzipping flatten s7 for input s5.out
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:21.834Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:21.856Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_1
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:21.948Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:21.991Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:22.030Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:22.063Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:22.103Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:22.138Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:22.180Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:22.225Z: JOB_MESSAGE_DETAILED: Unzipping flatten s7-u13 for input s8-reify-value0-c11
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:22.348Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten assert_that/Group/Flatten/Unzipped-1, into producer assert_that/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:22.389Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:22.427Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:22.461Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:22.491Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:22.514Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:22.537Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:22.664Z: JOB_MESSAGE_DEBUG: Executing wait step start21
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:22.720Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:22.763Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:22.796Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:22.825Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:22.881Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:22.939Z: JOB_MESSAGE_BASIC: Executing operation read+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:22.973Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:22.983Z: JOB_MESSAGE_BASIC: BigQuery query issued as job: "dataflow_job_3164395098598786286". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_3164395098598786286".
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:57:53.854Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:58:28.764Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-f failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded.  Limit: 1250.0 in region us-central1.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:58:28.803Z: JOB_MESSAGE_ERROR: Workflow failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:58:28.877Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:58:32.505Z: JOB_MESSAGE_BASIC: Finished operation read+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:58:32.651Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:58:33.081Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:58:33.116Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:58:50.614Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T23:58:50.655Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2019-12-06_15_57_15-2021396393477931944 is in state JOB_STATE_FAILED
apache_beam.io.gcp.bigquery_read_it_test: INFO: Deleting dataset python_read_table_15756766239635 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
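
The captured logging above shows the actual cause of this test error: the Dataflow worker pool never started because the regional CPUS quota was exhausted (JOB_MESSAGE_ERROR: QUOTA_EXCEEDED, limit 1250.0 in us-central1), not a defect in the pipeline itself. One way to make an individual test job less quota-hungry is to cap worker autoscaling; a minimal sketch with illustrative values only (the postcommit suite builds its own TestPipeline options):

from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions([
    '--runner=DataflowRunner',
    '--project=apache-beam-testing',
    '--region=us-central1',
    '--num_workers=1',
    '--max_num_workers=2',                      # stay well under the regional CPU quota
    '--temp_location=gs://example-bucket/tmp',  # placeholder bucket
])
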
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_15_42_41-9652566696639767122?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_15_57_15-2021396393477931944?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_15_59_14-15750407875062146491?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_16_08_39-8919072668854354264?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_16_16_24-17104260932048819615?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_15_42_37-875487238441122577?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_16_03_11-3941184006496530438?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_16_13_05-18050618405932231613?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_16_22_21-17684774265154562603?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_15_42_40-6730769154006117426?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_15_55_43-1166915562728292100?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_16_04_28-14392671620484530941?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_16_13_51-2080131381516974742?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_16_22_19-2944207128350599321?project=apache-beam-testing
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_15_42_38-10522316179789351776?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_16_04_03-13459904970061464364?project=apache-beam-testing
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_16_21_48-16855497136305228212?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_15_42_40-17220234929966392908?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_15_53_05-17661280064041676781?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_16_03_20-6712750831676051819?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_16_12_48-4879128306172251432?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_15_42_38-5695939679582177878?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_15_52_42-4699265372238554190?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_16_01_34-10183957681334163665?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_16_10_45-13069047382513014860?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_16_20_02-10653284928405384923?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_15_42_37-4765893285387280894?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_15_51_38-11451838665294785612?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_16_00_40-14263174441683622314?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_16_09_53-7050284031021081370?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_16_18_33-15442448754560569744?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_15_42_39-10960542193679743832?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_15_52_20-2755985889937156238?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_16_01_49-5230546696315796066?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_16_10_44-6743839450363034064?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_16_19_39-867354901353133607?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_16_28_03-12465061441065405976?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3265.270s

FAILED (SKIP=6, errors=1)

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 55m 35s
83 actionable tasks: 62 executed, 21 from cache

Publishing build scan...
https://scans.gradle.com/s/7o6ket5hvdmak

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python36 #1162

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python36/1162/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-8882] Implement Impulse() for BundleBasedRunner.

[robertwb] [BEAM-8882] Make Create fn-api agnostic.

[robertwb] [BEAM-8882] Fully specify types for Create composite.

[robertwb] [BEAM-8882] Make Read fn-api agnostic.

[robertwb] [BEAM-8882] Cleanup always-on use_sdf_bounded_source option.

[robertwb] [BEAM-8882] Annotate ParDo and CombineValues operations with proto

[robertwb] [BEAM-8882] Unconditionally populate pipeline_proto_coder_id.

[robertwb] [BEAM-8882] Fix overly-sensitive tests.

[robertwb] Fix sdf tests from create.

[robertwb] [BEAM-8882] Avoid attaching unrecognized properties.

[robertwb] [BEAM-8882] Accommodations for JRH.

[robertwb] Minor cleanup.


------------------------------------------
[...truncated 567.61 KB...]
            "servicePath": "https://dataflow.googleapis.com"
          }
        },
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python36:beam-master-20191112"
      }
    ]
  },
  "name": "beamapp-jenkins-1206213415-688277",
  "steps": [
    {
      "kind": "ParallelRead",
      "name": "s1",
      "properties": {
        "bigquery_export_format": "FORMAT_AVRO",
        "bigquery_flatten_results": true,
        "bigquery_query": "SELECT * FROM (SELECT \"apple\" as fruit), (SELECT \"orange\" as fruit),",
        "bigquery_use_legacy_sql": true,
        "display_data": [
          {
            "key": "source",
            "label": "Read Source",
            "namespace": "apache_beam.io.iobase.Read",
            "shortValue": "BigQuerySource",
            "type": "STRING",
            "value": "apache_beam.io.gcp.bigquery.BigQuerySource"
          },
          {
            "key": "query",
            "label": "Query",
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource",
            "type": "STRING",
            "value": "SELECT * FROM (SELECT \"apple\" as fruit), (SELECT \"orange\" as fruit),"
          },
          {
            "key": "validation",
            "label": "Validation Enabled",
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource",
            "type": "BOOLEAN",
            "value": false
          }
        ],
        "format": "bigquery",
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_1"
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_1"
                    }
                  ],
                  "is_pair_like": true,
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_1"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "read.out"
          }
        ],
        "user_name": "read"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s2",
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED",
        "dataset": "python_query_to_table_15756680549813",
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "RowAsDictJsonCoder$eNprYE5OLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBK5xfp",
              "component_encodings": [],
              "pipeline_proto_coder_id": "ref_Coder_RowAsDictJsonCoder_3"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "bigquery",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "schema": "{\"fields\": [{\"name\": \"fruit\", \"type\": \"STRING\", \"mode\": \"NULLABLE\"}]}",
        "table": "output_table",
        "user_name": "write/WriteToBigQuery/NativeWrite",
        "write_disposition": "WRITE_EMPTY"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: '2019-12-06T21:34:28.468697Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-12-06_13_34_26-11665017573183628690'
 location: 'us-central1'
 name: 'beamapp-jenkins-1206213415-688277'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-12-06T21:34:28.468697Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2019-12-06_13_34_26-11665017573183628690]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_34_26-11665017573183628690?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
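
The job graph above shows a ParallelRead from BigQuery (a legacy-SQL query over literal "apple"/"orange" rows) feeding a ParallelWrite into output_table. A minimal sketch of a pipeline that would produce a graph of this shape, reusing the query, table, dataset, schema and dispositions visible in the graph; this is not the actual test code, and the options list is hypothetical:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(['--project=apache-beam-testing',
                               '--temp_location=gs://my-temp-bucket/tmp'])  # hypothetical
    with beam.Pipeline(options=options) as p:
        _ = (
            p
            | 'read' >> beam.io.Read(beam.io.BigQuerySource(
                query='SELECT * FROM (SELECT "apple" as fruit), '
                      '(SELECT "orange" as fruit),',
                use_standard_sql=False))          # legacy SQL, as in the graph
            | 'write' >> beam.io.Write(beam.io.BigQuerySink(
                'output_table',
                dataset='python_query_to_table_15756680549813',
                project='apache-beam-testing',
                schema='fruit:STRING',
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY)))

BigQuerySource and BigQuerySink are the deprecated path, which is what the BeamDeprecationWarning lines further down in this log are about.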
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_24_44-17375632200182146324?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_45_50-13053995266779629668?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_24_46-17959687109593603978?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_39_21-8809492175324086249?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_47_38-8593446879644508866?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_55_56-3096700467211136555?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_14_06_40-3113543592119042804?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_24_42-10753143807351981705?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_43_10-1493538249216884792?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_51_52-7165663624735832569?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_14_01_57-1780434454625847915?project=apache-beam-testing
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
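
The FutureWarning lines above come from the experimental fileio match/read transforms exercised by fileio_test. A minimal sketch of that pattern on its own, with a hypothetical file pattern:

    import hashlib
    import apache_beam as beam
    from apache_beam.io import fileio

    def compute_hash(readable_file):
        # Read the whole file and return its SHA-1 hex digest.
        return hashlib.sha1(readable_file.read()).hexdigest()

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create(['gs://my-bucket/input/*'])   # hypothetical pattern
            | 'Match' >> fileio.MatchAll()       # experimental: emits FileMetadata
            | 'Read' >> fileio.ReadMatches()     # experimental: emits ReadableFile
            | 'Checksums' >> beam.Map(compute_hash))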
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_24_45-4859167274195283584?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_38_00-9192831246124543823?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_46_41-7583135934135745483?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_55_05-7145416589920166051?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_14_03_39-4704377214082078457?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_24_41-4994374460532121827?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_34_26-11665017573183628690?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_34_58-15605305136392764254?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_42_35-7752274535783643130?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_50_41-10002768588801987788?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_59_30-17502060410006567828?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_24_41-10961563439749441852?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_33_58-13150745073845490809?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_42_55-13849131766118105041?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_51_41-7012078162302995555?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_59_34-7312052602630847260?project=apache-beam-testing
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_24_45-15018327601732132655?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_34_03-4310119349928700680?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_42_08-14721107192408289955?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_51_18-847549010488227591?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_14_01_18-5297161758042159682?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_14_09_33-2283282927835382166?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_24_43-17844446074030582821?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_35_34-15259141964510180692?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_45_52-10107110136983736619?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_55_16-12372108651722603519?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
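
The BeamDeprecationWarning lines above repeatedly point from BigQuerySink to WriteToBigQuery. A minimal sketch of the suggested replacement, reusing the table, dataset, schema and dispositions from the job graph earlier in this log; the input rows here are made up:

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = p | beam.Create([{'fruit': 'apple'}, {'fruit': 'orange'}])
        _ = rows | 'write' >> beam.io.WriteToBigQuery(
            'output_table',
            dataset='python_query_to_table_15756680549813',
            project='apache-beam-testing',
            schema='fruit:STRING',
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY)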

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3270.661s

FAILED (SKIP=6, failures=1)

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle'> line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 55m 36s
83 actionable tasks: 62 executed, 21 from cache

Publishing build scan...
https://scans.gradle.com/s/24osmekfegcy6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



beam_PostCommit_Python36 - Build # 1161 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_PostCommit_Python36 (build #1161)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_Python36/1161/ to view the results.

beam_PostCommit_Python36 - Build # 1160 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_PostCommit_Python36 (build #1160)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_Python36/1160/ to view the results.

Build failed in Jenkins: beam_PostCommit_Python36 #1159

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python36/1159/display/redirect?page=changes>

Changes:

[thw] [BEAM-8815] Define the no artifacts retrieval token in proto


------------------------------------------
[...truncated 614.32 KB...]
            "output_name": "out",
            "step_name": "SideInput-s18"
          },
          "side1-write/Write/WriteImpl/FinalizeWrite": {
            "@type": "OutputReference",
            "output_name": "out",
            "step_name": "SideInput-s19"
          },
          "side2-write/Write/WriteImpl/FinalizeWrite": {
            "@type": "OutputReference",
            "output_name": "out",
            "step_name": "SideInput-s20"
          }
        },
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "write/Write/WriteImpl/FinalizeWrite.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s7"
        },
        "serialized_fn": "eNrNV/mf28QVt+zNgUjTACUQ0sNNSaul2EogWZothaROAsFZZ6tdiHoEMZbGHmVHGj3NaJ2lawqk3mzpBW2hhd6l903v+/g7+s/0zdje1LBL8xPt57MryW/mvZn3fd/3HenJihOSjISMBm1KkrrKSSo7Ik9kPRQ5tRuEc9Lm9HxOsozmJ8Xp1IbS9FNg9aHs+JVSqRR0UqhMBIkF/rWJpHbQiVPC48dp0MtjRW2Y8ncZlwLD5l2Z0RC2tfwdaMtyEVIpYTub8q/Xc9RKRgMWp0rCjsk94oCx1yOKmyRK5NI+c24RzQ9qsw07cYPXtfpgOyZUnGaFMvEkXN8yOxCFumrb1SrW4E3ton0BdjuvzkTRSyoWdrCI94U4XbLhzRh9Tx9ucPw9GKkTcxpkRLEgy2knvgQ3TkQQGXqnsr5MeIHzcrEcRzS3FxRRcfiINs6PbXATBn5LH252fBsDDz30DmFvGMWc1wN9tQMuSGTsNtxiKiBVDrcOYJ8Ht/nbxp6w3z+Hz10567qKJllNIk6kS2tY3RpNo5oSwxuVSrrZSi1WtZCLInKH2LiHj95zdObIkZkjh2aOzbg5lQXHUry1aF9NOyUJDWTR0Wm/jentv70P73DYXnYb2+/jrBJUcb7OJi2SQDKSRxLe2SyZbYYCs4YDE3AZmxzd7EWxoPI47TaEwedduMDtfTjY9m/QWetowy1owhIF7/ZvR3vtoDMaKpLpQ0ejmuig6er62gbv8fdOhuhy0R7Hccxg7Y5NPadN+qFIsNxSYm2HJbrDn0IzKZSA9/rX4WMSJ6Pq3WkA0DxyM07iFGr+Pk1w3VEpVjInMcckg5T28I58rK+bjggYJRoht1W0J1mZKppjX9ULFXP7RN4tEpqqeU5CygQ3UB1CqA4zfb2raZn73bQPRy7AUcef1pkj4w7VTFO6569ezyQZd0+PetZYYGZi6cxwyw4eTrM4XOI0WsBAZ3R/2XBPH97nGBgioggc28xxY/pJnGLDLG7s/X241/F3a/RCrQC6mXR+8AF/Pxq196zebmDaeBb3lGtFml0+DPcZIvTiNBK9IEE8NYwoR/dvpWk6jgkjbVNxwoPXeNtw3MC/HNOeDnZiovnCnBKFjCvSUDe2DR902C3+TpyvSao1CBoDOOnBKadpNUv4X2ne1LDXS6uldetyeaEEp1sDeGDauIxzgQcHcMaP0OIykVD3Ik2X4lSO7zXJyTJ1eyJfkpgXdXVawbyQqiGSJFbB/IpiIr17xpV56MpoSfeztrj/AYM7rEA9W4GHzOL3cpK0I3IfNOfOWY0SnPVv1n2diyTIi1Rp+m7sb86IpkFkJGjQWoNz0wrmPfjQBEBdqgIsIHLQM8u0i5grTAIWDKo4rEdhcQ0e9uCRCdc4yUSugkREBUdxO+/fqEnxGhKBP4APe/AREz5A31AFAXx0DT7mwQXWbG1Wr5DiD3iUYbWwMGUsTKW5o9loPKZKyrpYXi1FWJzLVr+8WpYHVq2Llaispi6XdMHUtlUciSpL5XxmtRxN7SuhbXu0bWhXO4a20VNFP3Uqe/H6tBXhTAicVrNsUo9oh6CGwmOaGP4TaJlvCM6pIVJVdKoSm656MKr2YsWqCR6/VcUIjqS0SjnVPV4dtgiNqkRWCTqkXU4VeuvS1Kun41yqquqJ8XxZpWkoCq0X2gcjHjgoD9xprnVAqWsPG5bHUkFodEvXXAnBJUT+dv1b8jikQM1Jg9WDjunVU0mmVja6GbpmmNMUmFE7c7KdynORQ8xuVXDRL5vYsGSgGJOIm9W1zkFyBVL2kFHeYHMSirlXrMaeUnmPNWXttnZZO62yVS5DNo00BA9ydoGJFp76sqVAeVAMYNmDHnu0D5e20IkVdlwO4HEPPr4Gq33oo+MTHnyiaLP72fHiCjy5oZWHr0krn2KogU877BjTwna5D5902H9XsYFejZ0YcXMoGlZzoTG1qvmzhnJxZZqhQqy/cQrxqUmFeGbuXxY7q3H+tAefQZyf0Th/FuH6nAef34Dr2Q247romuJ7TcH1hDNcX+/Cla4Hr+U3gerXGvoCgfdmA9pU3DrQXJ0F7CWWVnWVzDJXyqwjd1zz4OkL3Uov9L2ToG1qG2P+N9HxTwbccFrKIUdZhXcaYkYlvsyXGGUrBd9iLrycFL28uBd/VFP2eB99HnF/WFP0BUvSHHvxoAD/24CdaCn66hRT8zEjBzz34xRr8sg+voOOvPPj1Brd/Q1/3O+m8CYixbPgtcvl3ffi9Y+gwXAkX+MNW/sMZ9gPmdWQYBz9h/ohR/mQEGd+Au12aY4g/bxViNMU+OTxhFkc/4S8Y5K8mYSxekRSc6HrrA5bC35qWOV81sFKRJAvwnbaN7585/B2HzCdYLIPxofWP9aKt4J/1fwOgV8Kp",
        "user_name": "write/Write/WriteImpl/FinalizeWrite/FinalizeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
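
The step names in this graph and in the fusion messages that follow (read, split, pair_with_one, group, count, format, write) follow the classic wordcount shape. A rough sketch of a pipeline with that shape, not the actual test code, with hypothetical input and output paths:

    import re
    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | 'read' >> beam.io.ReadFromText('gs://my-bucket/input.txt')        # hypothetical
            | 'split' >> beam.FlatMap(lambda line: re.findall(r"[A-Za-z']+", line))
            | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
            | 'group' >> beam.GroupByKey()
            | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
            | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
            | 'write' >> beam.io.WriteToText('gs://my-bucket/output/results'))  # hypothetical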
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: '2019-12-06T15:04:14.476322Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-12-06_07_04_12-17334453194998877103'
 location: 'us-central1'
 name: 'beamapp-jenkins-1206150401-482730'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-12-06T15:04:14.476322Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2019-12-06_07_04_12-17334453194998877103]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_04_12-17334453194998877103?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2019-12-06_07_04_12-17334453194998877103 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:12.080Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-12-06_07_04_12-17334453194998877103.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:12.080Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-12-06_07_04_12-17334453194998877103. The number of workers will be between 1 and 1000.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:18.482Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:19.273Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.013Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.052Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.114Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.154Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.182Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.274Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.320Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.356Z: JOB_MESSAGE_DETAILED: Fusing consumer split into read/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.389Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.423Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Reify into pair_with_one
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.462Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Write into group/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.495Z: JOB_MESSAGE_DETAILED: Fusing consumer group/GroupByWindow into group/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.527Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.561Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.596Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WriteBundles/WriteBundles into format
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.628Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles/WriteBundles
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.663Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WindowInto(WindowIntoFn) into write/Write/WriteImpl/Pair
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.700Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Reify into write/Write/WriteImpl/WindowInto(WindowIntoFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.733Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Write into write/Write/WriteImpl/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.769Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/GroupByWindow into write/Write/WriteImpl/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.804Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Extract into write/Write/WriteImpl/GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.839Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/InitializeWrite into write/Write/WriteImpl/DoOnce/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.876Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.912Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.946Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:20.980Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:21.349Z: JOB_MESSAGE_DEBUG: Executing wait step start26
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:21.420Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:21.471Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:21.471Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:21.503Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:21.514Z: JOB_MESSAGE_BASIC: Executing operation group/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:21.565Z: JOB_MESSAGE_BASIC: Finished operation group/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:21.565Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:21.639Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:21.674Z: JOB_MESSAGE_DEBUG: Value "group/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:21.752Z: JOB_MESSAGE_BASIC: Executing operation read/Read+split+pair_with_one+group/Reify+group/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:04:34.572Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:06:02.672Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-f failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded.  Limit: 1250.0 in region us-central1.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:06:02.755Z: JOB_MESSAGE_ERROR: Workflow failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:06:02.824Z: JOB_MESSAGE_BASIC: Finished operation read/Read+split+pair_with_one+group/Reify+group/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:06:03.236Z: JOB_MESSAGE_WARNING: S01:write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:06:03.273Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:06:03.445Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:06:03.515Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:06:03.596Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:06:21.293Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-06T15:06:21.326Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2019-12-06_07_04_12-17334453194998877103 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1575644640696/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1575644640696/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1575644640696\\/results[^/\\\\]*'
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
apache_beam.io.gcp.gcsio: INFO: Finished listing 0 files in 0.044794321060180664 seconds.
--------------------- >> end captured logging << ---------------------
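
The captured log above fails at worker pool startup with QUOTA_EXCEEDED on the 'CPUS' quota (limit 1250.0) in us-central1. A minimal sketch of the worker options that control a Dataflow job's CPU footprint; the flag values below are hypothetical:

    from apache_beam.options.pipeline_options import PipelineOptions, WorkerOptions

    options = PipelineOptions([
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--machine_type=n1-standard-1',   # 1 vCPU per worker, as in the log above
        '--num_workers=1',                # hypothetical: start with a single worker
        '--max_num_workers=10',           # hypothetical: cap autoscaling well below the quota
    ])
    print(options.view_as(WorkerOptions).max_num_workers)   # -> 10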
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_06_56_31-15672243354490757734?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_11_19-4864907767473857612?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_21_26-3628255653528618958?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_29_00-16773713629188136544?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_06_56_27-15845987749017144554?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:731: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_17_52-959298929210430660?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_26_26-16301775363064267255?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:731: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_34_21-15072377663164976731?project=apache-beam-testing
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:731: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:731: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_06_56_29-13661784551793510319?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_08_54-2308632051937536793?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_17_03-4352901982263185615?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_24_52-14276275462301628156?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_06_56_28-10739745907984242754?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_15_05-10015800443278769141?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_24_36-9831340374879966799?project=apache-beam-testing
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_33_28-8313039699167830538?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_06_56_26-16869582593599383426?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:731: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_05_44-15682618903316494669?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_13_54-4379704164579702242?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:731: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_21_50-360714274098150705?project=apache-beam-testing
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:731: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_30_45-1791919862282354059?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_06_56_26-5488011083939641946?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:731: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_04_12-17334453194998877103?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_06_41-4859612422002232921?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_15_23-9097994767158267251?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_33_02-17965039527072517546?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_06_56_30-11528168057418725523?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_05_29-333837951530954605?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_14_41-2217840380945017527?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_23_28-4218495075860136095?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_32_04-9547437726079369947?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_40_51-13853256687409989780?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_06_56_29-13509663524942145714?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_06_27-2936298648044748702?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_16_16-10423657154119066870?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_25_11-10023038633853642339?project=apache-beam-testing
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_07_33_34-985573808435903279?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
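
The repeated "options is deprecated since First stable release" warnings above come from reading settings back off the pipeline object (p.options / pcoll.pipeline.options). One way to avoid that, sketched with hypothetical values, is to keep a reference to the PipelineOptions used to construct the pipeline and query that instead:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        DebugOptions, GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions(['--temp_location=gs://my-bucket/tmp'])  # hypothetical
    p = beam.Pipeline(options=options)

    # Read settings from the options object rather than from p.options:
    experiments = options.view_as(DebugOptions).experiments or []
    temp_location = options.view_as(GoogleCloudOptions).temp_location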

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3204.002s

FAILED (SKIP=6, errors=1)

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle'> line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 50s
83 actionable tasks: 65 executed, 18 from cache

Publishing build scan...
https://scans.gradle.com/s/gs2ybcw4seqly

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
