Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/11/14 00:42:29 UTC

Build failed in Jenkins: beam_PostCommit_Python36 #989

See <https://builds.apache.org/job/beam_PostCommit_Python36/989/display/redirect?page=changes>

Changes:

[kirillkozlov] Created a MongoDbTable and a provider for it

[kirillkozlov] [SQL] Implemented write functionality for MongoDbTable, updated

[kirillkozlov] spotlessApply

[kirillkozlov] ToJson should support logical types

[kirillkozlov] Added RowJsonTest for logical types


------------------------------------------
[...truncated 559.40 KB...]
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1573689094.758980017","description":"Error received from peer ipv4:127.0.0.1:45899","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 607, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 392, in __next__
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1573689094.758942413","description":"Error received from peer ipv4:127.0.0.1:39907","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
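
Both reader threads die the same way: iterating a gRPC streaming response raises once the peer closes the socket. A minimal sketch (illustrative helper, not part of this build) of how such StatusCode.UNAVAILABLE failures surface through the public grpc Python API:

import grpc

def drain_stream(response_iterator):
    # Consuming a streaming RPC raises grpc.RpcError when the peer goes
    # away mid-stream; UNAVAILABLE with "Socket closed" usually means the
    # server side of the channel shut down first.
    try:
        for response in response_iterator:
            yield response
    except grpc.RpcError as err:
        if err.code() == grpc.StatusCode.UNAVAILABLE:
            return  # treat a vanished server as end of stream
        raise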


> Task :sdks:python:test-suites:portable:py36:postCommitPy36

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT
(unset)
[...56 identical '(unset)' lines truncated...]
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_50_18-8487417414048868764?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_04_29-16431337957815962176?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
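
The BeamDeprecationWarning above flags code that dereferences <pipeline>.options. A minimal sketch (flag value illustrative) of the supported pattern, querying a PipelineOptions object directly via view_as:

from apache_beam.options.pipeline_options import DebugOptions, PipelineOptions

# Hold the options yourself and ask for a typed view, rather than
# reaching through the deprecated .options attribute of a Pipeline.
options = PipelineOptions(['--experiments=use_beam_bq_sink'])
experiments = options.view_as(DebugOptions).experiments or []
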
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_12_10-76926037268749475?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_19_45-15831289352896402195?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_27_18-805721596976429951?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_50_16-12503179732977812714?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_13_34-16502012432304503848?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_21_51-9970733606742144755?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_29_47-3205480505732144348?project=apache-beam-testing
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
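
The warning repeated above recommends WriteToBigQuery in place of the deprecated BigQuerySink. A hedged sketch (project, dataset, and schema are hypothetical) of the replacement transform:

import apache_beam as beam
from apache_beam.io.gcp.bigquery import BigQueryDisposition, WriteToBigQuery

with beam.Pipeline() as p:
    (p
     | beam.Create([{'word': 'beam', 'count': 1}])
     | WriteToBigQuery(
         'my-project:my_dataset.my_table',
         schema='word:STRING,count:INTEGER',
         write_disposition=BigQueryDisposition.WRITE_APPEND))
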
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_50_17-17487171326934125434?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_02_53-3097380415094331187?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_11_22-3607881339517771253?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1208: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_50_14-13818487076606140604?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_09_18-17309275289632105263?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_18_21-13709422992952475401?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_26_07-10239321693077747784?project=apache-beam-testing
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_50_14-4544109910891428035?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_58_43-13785548934713492743?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:648: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_08_56-17041015389770448863?project=apache-beam-testing
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_18_43-15785811389211781364?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_27_55-8814529213526684042?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
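
The FutureWarnings above mark the experimental fileio transforms these tests exercise. A minimal sketch (bucket and pattern hypothetical) of the MatchAll / ReadMatches shape they refer to:

import apache_beam as beam
from apache_beam.io import fileio

with beam.Pipeline() as p:
    contents = (p
                | beam.Create(['gs://my-bucket/input*.txt'])
                | fileio.MatchAll()      # each pattern -> FileMetadata records
                | fileio.ReadMatches()   # each match -> a ReadableFile
                | beam.Map(lambda f: f.read_utf8()))
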
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_50_14-12349609192755690759?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_59_30-17760500035841430600?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_07_34-5175807771389470595?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_16_31-11445525652022241929?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_25_06-18217259726142285840?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_34_16-6393408866674417789?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_50_16-3930703230832291457?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_59_34-14358804236747782894?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_08_10-14550330959125886823?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_16_27-982999766609357360?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_24_39-11983473404674629501?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_50_15-8205326846623720335?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_59_51-9204801903114631675?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_09_29-13301100686002300133?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_18_46-4594808160034192914?project=apache-beam-testing
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_28_20-8341259082287408629?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3151.066s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py36:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 55m 0s
81 actionable tasks: 75 executed, 6 from cache

Publishing build scan...
https://gradle.com/s/wuxp2vwavf5w4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Jenkins build is back to normal : beam_PostCommit_Python36 #992

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python36/992/display/redirect>



beam_PostCommit_Python36 - Build # 991 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_PostCommit_Python36 (build #991)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_Python36/991/ to view the results.

Build failed in Jenkins: beam_PostCommit_Python36 #990

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python36/990/display/redirect>

Changes:


------------------------------------------
[...truncated 555.33 KB...]
19/11/14 01:19:29 INFO MemoryStore: Block broadcast_5 stored as values in memory (estimated size 22.8 KB, free 13.5 GB)
19/11/14 01:19:29 INFO MemoryStore: Block broadcast_5_piece0 stored as bytes in memory (estimated size 9.9 KB, free 13.5 GB)
19/11/14 01:19:29 INFO BlockManagerInfo: Added broadcast_5_piece0 in memory on localhost:39845 (size: 9.9 KB, free: 13.5 GB)
19/11/14 01:19:29 INFO SparkContext: Created broadcast 5 from broadcast at DAGScheduler.scala:1161
19/11/14 01:19:29 INFO DAGScheduler: Submitting 4 missing tasks from ResultStage 4 (MapPartitionsRDD[33] at map at BoundedDataset.java:75) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
19/11/14 01:19:29 INFO TaskSchedulerImpl: Adding task set 4.0 with 4 tasks
19/11/14 01:19:29 INFO TaskSetManager: Starting task 0.0 in stage 4.0 (TID 16, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/14 01:19:29 INFO TaskSetManager: Starting task 1.0 in stage 4.0 (TID 17, localhost, executor driver, partition 1, PROCESS_LOCAL, 7662 bytes)
19/11/14 01:19:29 INFO TaskSetManager: Starting task 2.0 in stage 4.0 (TID 18, localhost, executor driver, partition 2, PROCESS_LOCAL, 7662 bytes)
19/11/14 01:19:29 INFO TaskSetManager: Starting task 3.0 in stage 4.0 (TID 19, localhost, executor driver, partition 3, PROCESS_LOCAL, 7662 bytes)
19/11/14 01:19:29 INFO Executor: Running task 0.0 in stage 4.0 (TID 16)
19/11/14 01:19:29 INFO Executor: Running task 2.0 in stage 4.0 (TID 18)
19/11/14 01:19:29 INFO Executor: Running task 1.0 in stage 4.0 (TID 17)
19/11/14 01:19:29 INFO Executor: Running task 3.0 in stage 4.0 (TID 19)
19/11/14 01:19:29 INFO ShuffleBlockFetcherIterator: Getting 0 non-empty blocks including 0 local blocks and 0 remote blocks
19/11/14 01:19:29 INFO ShuffleBlockFetcherIterator: Getting 4 non-empty blocks including 4 local blocks and 0 remote blocks
19/11/14 01:19:29 INFO ShuffleBlockFetcherIterator: Getting 0 non-empty blocks including 0 local blocks and 0 remote blocks
19/11/14 01:19:29 INFO ShuffleBlockFetcherIterator: Getting 0 non-empty blocks including 0 local blocks and 0 remote blocks
19/11/14 01:19:29 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/14 01:19:29 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/14 01:19:29 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/14 01:19:29 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/14 01:19:29 INFO MemoryStore: Block rdd_32_3 stored as values in memory (estimated size 16.0 B, free 13.5 GB)
19/11/14 01:19:29 INFO BlockManagerInfo: Added rdd_32_3 in memory on localhost:39845 (size: 16.0 B, free: 13.5 GB)
19/11/14 01:19:29 INFO Executor: Finished task 3.0 in stage 4.0 (TID 19). 10453 bytes result sent to driver
19/11/14 01:19:29 INFO TaskSetManager: Finished task 3.0 in stage 4.0 (TID 19) in 44 ms on localhost (executor driver) (1/4)
19/11/14 01:19:29 INFO MemoryStore: Block rdd_32_1 stored as values in memory (estimated size 16.0 B, free 13.5 GB)
19/11/14 01:19:29 INFO MemoryStore: Block rdd_32_2 stored as values in memory (estimated size 16.0 B, free 13.5 GB)
19/11/14 01:19:29 INFO BlockManagerInfo: Added rdd_32_1 in memory on localhost:39845 (size: 16.0 B, free: 13.5 GB)
19/11/14 01:19:29 INFO BlockManagerInfo: Added rdd_32_2 in memory on localhost:39845 (size: 16.0 B, free: 13.5 GB)
19/11/14 01:19:29 INFO MemoryStore: Block rdd_32_0 stored as values in memory (estimated size 920.0 B, free 13.5 GB)
19/11/14 01:19:29 INFO BlockManagerInfo: Added rdd_32_0 in memory on localhost:39845 (size: 920.0 B, free: 13.5 GB)
19/11/14 01:19:29 INFO Executor: Finished task 2.0 in stage 4.0 (TID 18). 10453 bytes result sent to driver
19/11/14 01:19:29 INFO Executor: Finished task 1.0 in stage 4.0 (TID 17). 10453 bytes result sent to driver
19/11/14 01:19:29 INFO Executor: Finished task 0.0 in stage 4.0 (TID 16). 11065 bytes result sent to driver
19/11/14 01:19:29 INFO TaskSetManager: Finished task 2.0 in stage 4.0 (TID 18) in 63 ms on localhost (executor driver) (2/4)
19/11/14 01:19:29 INFO TaskSetManager: Finished task 1.0 in stage 4.0 (TID 17) in 65 ms on localhost (executor driver) (3/4)
19/11/14 01:19:29 INFO TaskSetManager: Finished task 0.0 in stage 4.0 (TID 16) in 65 ms on localhost (executor driver) (4/4)
19/11/14 01:19:29 INFO TaskSchedulerImpl: Removed TaskSet 4.0, whose tasks have all completed, from pool 
19/11/14 01:19:29 INFO DAGScheduler: ResultStage 4 (collect at BoundedDataset.java:76) finished in 0.074 s
19/11/14 01:19:29 INFO DAGScheduler: Job 1 finished: collect at BoundedDataset.java:76, took 0.605815 s
19/11/14 01:19:29 INFO MemoryStore: Block broadcast_6 stored as values in memory (estimated size 832.0 B, free 13.5 GB)
19/11/14 01:19:29 INFO MemoryStore: Block broadcast_6_piece0 stored as bytes in memory (estimated size 980.0 B, free 13.5 GB)
19/11/14 01:19:29 INFO BlockManagerInfo: Added broadcast_6_piece0 in memory on localhost:39845 (size: 980.0 B, free: 13.5 GB)
19/11/14 01:19:29 INFO SparkContext: Created broadcast 6 from broadcast at SparkBatchPortablePipelineTranslator.java:336
19/11/14 01:19:29 INFO MemoryStore: Block broadcast_7 stored as values in memory (estimated size 288.0 B, free 13.5 GB)
19/11/14 01:19:29 INFO MemoryStore: Block broadcast_7_piece0 stored as bytes in memory (estimated size 802.0 B, free 13.5 GB)
19/11/14 01:19:29 INFO BlockManagerInfo: Added broadcast_7_piece0 in memory on localhost:39845 (size: 802.0 B, free: 13.5 GB)
19/11/14 01:19:29 INFO SparkContext: Created broadcast 7 from broadcast at SparkBatchPortablePipelineTranslator.java:336
19/11/14 01:19:29 INFO MemoryStore: Block broadcast_8 stored as values in memory (estimated size 288.0 B, free 13.5 GB)
19/11/14 01:19:29 INFO MemoryStore: Block broadcast_8_piece0 stored as bytes in memory (estimated size 802.0 B, free 13.5 GB)
19/11/14 01:19:29 INFO BlockManagerInfo: Added broadcast_8_piece0 in memory on localhost:39845 (size: 802.0 B, free: 13.5 GB)
19/11/14 01:19:29 INFO SparkContext: Created broadcast 8 from broadcast at SparkBatchPortablePipelineTranslator.java:336
19/11/14 01:19:29 INFO SparkContext: Starting job: collect at BoundedDataset.java:76
19/11/14 01:19:29 INFO DAGScheduler: Got job 2 (collect at BoundedDataset.java:76) with 4 output partitions
19/11/14 01:19:29 INFO DAGScheduler: Final stage: ResultStage 5 (collect at BoundedDataset.java:76)
19/11/14 01:19:29 INFO DAGScheduler: Parents of final stage: List()
19/11/14 01:19:29 INFO DAGScheduler: Missing parents: List()
19/11/14 01:19:29 INFO DAGScheduler: Submitting ResultStage 5 (MapPartitionsRDD[36] at map at BoundedDataset.java:75), which has no missing parents
19/11/14 01:19:29 INFO MemoryStore: Block broadcast_9 stored as values in memory (estimated size 30.9 KB, free 13.5 GB)
19/11/14 01:19:29 INFO MemoryStore: Block broadcast_9_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.5 GB)
19/11/14 01:19:29 INFO BlockManagerInfo: Added broadcast_9_piece0 in memory on localhost:39845 (size: 12.4 KB, free: 13.5 GB)
19/11/14 01:19:29 INFO SparkContext: Created broadcast 9 from broadcast at DAGScheduler.scala:1161
19/11/14 01:19:29 INFO DAGScheduler: Submitting 4 missing tasks from ResultStage 5 (MapPartitionsRDD[36] at map at BoundedDataset.java:75) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
19/11/14 01:19:29 INFO TaskSchedulerImpl: Adding task set 5.0 with 4 tasks
19/11/14 01:19:29 INFO TaskSetManager: Starting task 0.0 in stage 5.0 (TID 20, localhost, executor driver, partition 0, PROCESS_LOCAL, 7868 bytes)
19/11/14 01:19:29 INFO TaskSetManager: Starting task 1.0 in stage 5.0 (TID 21, localhost, executor driver, partition 1, PROCESS_LOCAL, 7868 bytes)
19/11/14 01:19:29 INFO TaskSetManager: Starting task 2.0 in stage 5.0 (TID 22, localhost, executor driver, partition 2, PROCESS_LOCAL, 7868 bytes)
19/11/14 01:19:29 INFO TaskSetManager: Starting task 3.0 in stage 5.0 (TID 23, localhost, executor driver, partition 3, PROCESS_LOCAL, 7879 bytes)
19/11/14 01:19:29 INFO Executor: Running task 1.0 in stage 5.0 (TID 21)
19/11/14 01:19:29 INFO Executor: Running task 3.0 in stage 5.0 (TID 23)
19/11/14 01:19:29 INFO Executor: Running task 0.0 in stage 5.0 (TID 20)
19/11/14 01:19:29 INFO Executor: Running task 2.0 in stage 5.0 (TID 22)
19/11/14 01:19:29 INFO BlockManager: Found block rdd_19_0 locally
19/11/14 01:19:29 INFO BlockManager: Found block rdd_19_1 locally
19/11/14 01:19:29 INFO BlockManager: Found block rdd_19_3 locally
19/11/14 01:19:29 INFO BlockManager: Found block rdd_19_2 locally
19/11/14 01:19:29 INFO Executor: Finished task 1.0 in stage 5.0 (TID 21). 10082 bytes result sent to driver
19/11/14 01:19:29 INFO TaskSetManager: Finished task 1.0 in stage 5.0 (TID 21) in 35 ms on localhost (executor driver) (1/4)
19/11/14 01:19:29 INFO Executor: Finished task 2.0 in stage 5.0 (TID 22). 10082 bytes result sent to driver
19/11/14 01:19:29 INFO TaskSetManager: Finished task 2.0 in stage 5.0 (TID 22) in 43 ms on localhost (executor driver) (2/4)
19/11/14 01:19:29 INFO Executor: Finished task 0.0 in stage 5.0 (TID 20). 10082 bytes result sent to driver
19/11/14 01:19:29 INFO TaskSetManager: Finished task 0.0 in stage 5.0 (TID 20) in 53 ms on localhost (executor driver) (3/4)
WARNING:root:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/11/14 01:19:40 INFO Executor: Finished task 3.0 in stage 5.0 (TID 23). 10125 bytes result sent to driver
19/11/14 01:19:40 INFO TaskSetManager: Finished task 3.0 in stage 5.0 (TID 23) in 11028 ms on localhost (executor driver) (4/4)
19/11/14 01:19:40 INFO TaskSchedulerImpl: Removed TaskSet 5.0, whose tasks have all completed, from pool 
19/11/14 01:19:40 INFO DAGScheduler: ResultStage 5 (collect at BoundedDataset.java:76) finished in 11.035 s
19/11/14 01:19:40 INFO DAGScheduler: Job 2 finished: collect at BoundedDataset.java:76, took 11.040103 s
19/11/14 01:19:40 INFO MemoryStore: Block broadcast_10 stored as values in memory (estimated size 176.0 B, free 13.5 GB)
19/11/14 01:19:40 INFO MemoryStore: Block broadcast_10_piece0 stored as bytes in memory (estimated size 702.0 B, free 13.5 GB)
19/11/14 01:19:40 INFO BlockManagerInfo: Added broadcast_10_piece0 in memory on localhost:39845 (size: 702.0 B, free: 13.5 GB)
19/11/14 01:19:40 INFO SparkContext: Created broadcast 10 from broadcast at SparkBatchPortablePipelineTranslator.java:336
19/11/14 01:19:40 INFO MemoryStore: Block broadcast_11 stored as values in memory (estimated size 832.0 B, free 13.5 GB)
19/11/14 01:19:40 INFO MemoryStore: Block broadcast_11_piece0 stored as bytes in memory (estimated size 980.0 B, free 13.5 GB)
19/11/14 01:19:40 INFO BlockManagerInfo: Added broadcast_11_piece0 in memory on localhost:39845 (size: 980.0 B, free: 13.5 GB)
19/11/14 01:19:40 INFO SparkContext: Created broadcast 11 from broadcast at SparkBatchPortablePipelineTranslator.java:336
19/11/14 01:19:40 INFO SparkPipelineRunner: Job BeamApp-jenkins-1114011924-c3f8f09e_c80d6b1b-60c0-46e9-8e73-527018a8319f: Pipeline translated successfully. Computing outputs
19/11/14 01:19:40 INFO SparkContext: Starting job: foreach at BoundedDataset.java:124
19/11/14 01:19:40 INFO DAGScheduler: Got job 3 (foreach at BoundedDataset.java:124) with 4 output partitions
19/11/14 01:19:40 INFO DAGScheduler: Final stage: ResultStage 6 (foreach at BoundedDataset.java:124)
19/11/14 01:19:40 INFO DAGScheduler: Parents of final stage: List()
19/11/14 01:19:40 INFO DAGScheduler: Missing parents: List()
19/11/14 01:19:40 INFO DAGScheduler: Submitting ResultStage 6 (EmptyOutputSink_0 MapPartitionsRDD[38] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/14 01:19:40 INFO MemoryStore: Block broadcast_12 stored as values in memory (estimated size 32.8 KB, free 13.5 GB)
19/11/14 01:19:40 INFO MemoryStore: Block broadcast_12_piece0 stored as bytes in memory (estimated size 12.9 KB, free 13.5 GB)
19/11/14 01:19:40 INFO BlockManagerInfo: Added broadcast_12_piece0 in memory on localhost:39845 (size: 12.9 KB, free: 13.5 GB)
19/11/14 01:19:40 INFO SparkContext: Created broadcast 12 from broadcast at DAGScheduler.scala:1161
19/11/14 01:19:40 INFO DAGScheduler: Submitting 4 missing tasks from ResultStage 6 (EmptyOutputSink_0 MapPartitionsRDD[38] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
19/11/14 01:19:40 INFO TaskSchedulerImpl: Adding task set 6.0 with 4 tasks
19/11/14 01:19:40 INFO TaskSetManager: Starting task 0.0 in stage 6.0 (TID 24, localhost, executor driver, partition 0, PROCESS_LOCAL, 7868 bytes)
19/11/14 01:19:40 INFO TaskSetManager: Starting task 1.0 in stage 6.0 (TID 25, localhost, executor driver, partition 1, PROCESS_LOCAL, 7868 bytes)
19/11/14 01:19:40 INFO TaskSetManager: Starting task 2.0 in stage 6.0 (TID 26, localhost, executor driver, partition 2, PROCESS_LOCAL, 7868 bytes)
19/11/14 01:19:40 INFO TaskSetManager: Starting task 3.0 in stage 6.0 (TID 27, localhost, executor driver, partition 3, PROCESS_LOCAL, 7879 bytes)
19/11/14 01:19:40 INFO Executor: Running task 0.0 in stage 6.0 (TID 24)
19/11/14 01:19:40 INFO Executor: Running task 2.0 in stage 6.0 (TID 26)
19/11/14 01:19:40 INFO Executor: Running task 3.0 in stage 6.0 (TID 27)
19/11/14 01:19:40 INFO Executor: Running task 1.0 in stage 6.0 (TID 25)
19/11/14 01:19:40 INFO BlockManager: Found block rdd_19_2 locally
19/11/14 01:19:40 INFO BlockManager: Found block rdd_19_0 locally
19/11/14 01:19:40 INFO BlockManager: Found block rdd_19_3 locally
19/11/14 01:19:40 INFO BlockManager: Found block rdd_19_1 locally
19/11/14 01:19:40 INFO Executor: Finished task 0.0 in stage 6.0 (TID 24). 9341 bytes result sent to driver
19/11/14 01:19:40 INFO TaskSetManager: Finished task 0.0 in stage 6.0 (TID 24) in 39 ms on localhost (executor driver) (1/4)
19/11/14 01:19:40 INFO Executor: Finished task 1.0 in stage 6.0 (TID 25). 9341 bytes result sent to driver
19/11/14 01:19:40 INFO TaskSetManager: Finished task 1.0 in stage 6.0 (TID 25) in 53 ms on localhost (executor driver) (2/4)
19/11/14 01:19:40 INFO Executor: Finished task 2.0 in stage 6.0 (TID 26). 9341 bytes result sent to driver
19/11/14 01:19:40 INFO TaskSetManager: Finished task 2.0 in stage 6.0 (TID 26) in 58 ms on localhost (executor driver) (3/4)
INFO:root:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:root:Renamed 4 shards in 0.10 seconds.
19/11/14 01:19:51 INFO Executor: Finished task 3.0 in stage 6.0 (TID 27). 9384 bytes result sent to driver
19/11/14 01:19:51 INFO TaskSetManager: Finished task 3.0 in stage 6.0 (TID 27) in 11154 ms on localhost (executor driver) (4/4)
19/11/14 01:19:51 INFO TaskSchedulerImpl: Removed TaskSet 6.0, whose tasks have all completed, from pool 
19/11/14 01:19:51 INFO DAGScheduler: ResultStage 6 (foreach at BoundedDataset.java:124) finished in 11.163 s
19/11/14 01:19:51 INFO DAGScheduler: Job 3 finished: foreach at BoundedDataset.java:124, took 11.166911 s
19/11/14 01:19:51 INFO SparkPipelineRunner: Job BeamApp-jenkins-1114011924-c3f8f09e_c80d6b1b-60c0-46e9-8e73-527018a8319f finished.
19/11/14 01:19:51 INFO SparkUI: Stopped Spark web UI at http://localhost:4040
19/11/14 01:19:51 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/11/14 01:19:51 INFO MemoryStore: MemoryStore cleared
19/11/14 01:19:51 INFO BlockManager: BlockManager stopped
19/11/14 01:19:51 INFO BlockManagerMaster: BlockManagerMaster stopped
19/11/14 01:19:51 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/11/14 01:19:51 INFO SparkContext: Successfully stopped SparkContext
19/11/14 01:19:51 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/14 01:19:51 INFO AbstractArtifactRetrievalService: Manifest at /tmp/beam-temp6bkwvgdy/artifacts68a0sk7c/job_e864a608-ff83-4b7a-96b0-53b39cd9be88/MANIFEST has 1 artifact locations
19/11/14 01:19:51 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-temp6bkwvgdy/artifacts68a0sk7c/job_e864a608-ff83-4b7a-96b0-53b39cd9be88/
INFO:root:Job state changed to DONE
19/11/14 01:19:51 INFO InMemoryJobService: Getting job metrics for BeamApp-jenkins-1114011924-c3f8f09e_c80d6b1b-60c0-46e9-8e73-527018a8319f
19/11/14 01:19:51 INFO InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1114011924-c3f8f09e_c80d6b1b-60c0-46e9-8e73-527018a8319f
19/11/14 01:19:51 INFO ShutdownHookManager: Shutdown hook called
19/11/14 01:19:51 INFO ShutdownHookManager: Deleting directory /tmp/spark-b8df6111-ae9d-4fc3-b3c7-81e79e897a2d
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 607, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 392, in __next__
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1573694392.004834822","description":"Error received from peer ipv4:127.0.0.1:40381","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 148, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 392, in __next__
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1573694392.004844614","description":"Error received from peer ipv4:127.0.0.1:46427","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:root:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 392, in __next__
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1573694392.004795874","description":"Error received from peer ipv4:127.0.0.1:41341","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 286, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 392, in __next__
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1573694392.004795874","description":"Error received from peer ipv4:127.0.0.1:41341","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py36:postCommitPy36

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 12s
82 actionable tasks: 62 executed, 20 from cache

Publishing build scan...
https://gradle.com/s/bgl22uhh5pvpm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org