Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/01/21 07:02:33 UTC

Build failed in Jenkins: beam_PostCommit_Python2 #1502

See <https://builds.apache.org/job/beam_PostCommit_Python2/1502/display/redirect>

Changes:


------------------------------------------
[...truncated 3.60 MB...]
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-0121060840-23472513_406608b7-39d4-42f6-b5f3-b738842f9c3f
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-0121060840-23472513_406608b7-39d4-42f6-b5f3-b738842f9c3f
INFO:root:number of empty lines: 2
INFO:root:average word length: 3

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7f79594972a8> ====================
WARNING:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/runners/spark/job-server/build/libs/beam-runners-spark-job-server-2.20.0-SNAPSHOT.jar>' '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-temp9gguDb/artifactsShAjRJ' '--job-port' '52947' '--artifact-port' '0' '--expansion-port' '0']
20/01/21 06:09:03 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:43973
20/01/21 06:09:03 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:38467
20/01/21 06:09:03 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:52947
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
20/01/21 06:09:04 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-0121060904-4a1d2e6_4f6400e9-512d-4b47-b5cb-959341a2c28d
20/01/21 06:09:04 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-0121060904-4a1d2e6_4f6400e9-512d-4b47-b5cb-959341a2c28d
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
20/01/21 06:09:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
20/01/21 06:09:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
20/01/21 06:09:05 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
20/01/21 06:09:06 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/01/21 06:09:06 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-0121060904-4a1d2e6_4f6400e9-512d-4b47-b5cb-959341a2c28d on Spark master local[4]
20/01/21 06:09:06 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
20/01/21 06:09:06 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:37729.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
20/01/21 06:09:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:42943.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:35585
20/01/21 06:09:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
20/01/21 06:09:09 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
20/01/21 06:09:49 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-0121060904-4a1d2e6_4f6400e9-512d-4b47-b5cb-959341a2c28d: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
20/01/21 06:09:56 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-0121060904-4a1d2e6_4f6400e9-512d-4b47-b5cb-959341a2c28d finished.
20/01/21 06:09:56 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
20/01/21 06:09:56 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-temp9gguDb/artifactsShAjRJ/job_2b129af0-11b2-43c1-87ef-7e3ce47029be/MANIFEST has 1 artifact locations
20/01/21 06:09:56 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-temp9gguDb/artifactsShAjRJ/job_2b129af0-11b2-43c1-87ef-7e3ce47029be/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
20/01/21 06:09:56 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-0121060904-4a1d2e6_4f6400e9-512d-4b47-b5cb-959341a2c28d
20/01/21 06:09:56 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-0121060904-4a1d2e6_4f6400e9-512d-4b47-b5cb-959341a2c28d
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 423, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 413, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 703, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1579586996.526101341","description":"Error received from peer ipv4:127.0.0.1:35585","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 438, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 423, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 413, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 703, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1579586996.526101341","description":"Error received from peer ipv4:127.0.0.1:35585","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 137, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 413, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 703, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1579586996.526589017","description":"Error received from peer ipv4:127.0.0.1:37729","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 649, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 413, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 703, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1579586996.526117886","description":"Error received from peer ipv4:127.0.0.1:42943","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:742: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:771: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_io_read_pipeline.py>:75: FutureWarning: _ReadFromBigQuery is experimental.
  known_args.input_table))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1605: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:259: FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1605: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:155: FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1605: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:757: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:771: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1418: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:742: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:298: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:309: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:309: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:75: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 51 tests in 3627.738s

OK (SKIP=7)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 50

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 2m 8s
121 actionable tasks: 95 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/qqergc7ae2m3q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python2 #1504

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1504/display/redirect?page=changes>




Build failed in Jenkins: beam_PostCommit_Python2 #1503

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1503/display/redirect>

Changes:


------------------------------------------
[...truncated 146.43 KB...]
namenode_1  | 20/01/21 12:02:27 INFO hdfs.StateChange: STATE* Replication Queue initialization scan for invalid, over- and under-replicated blocks completed in 13 msec
namenode_1  | 20/01/21 12:02:27 INFO ipc.Server: IPC Server Responder: starting
namenode_1  | 20/01/21 12:02:27 INFO ipc.Server: IPC Server listener on 8020: starting
namenode_1  | 20/01/21 12:02:27 INFO namenode.NameNode: NameNode RPC up at: namenode/172.28.0.2:8020
namenode_1  | 20/01/21 12:02:27 INFO namenode.FSNamesystem: Starting services required for active state
namenode_1  | 20/01/21 12:02:27 INFO namenode.FSDirectory: Initializing quota with 4 thread(s)
namenode_1  | 20/01/21 12:02:27 INFO namenode.FSDirectory: Quota initialization completed in 4 milliseconds
namenode_1  | name space=1
namenode_1  | storage space=0
namenode_1  | storage types=RAM_DISK=0, SSD=0, DISK=0, ARCHIVE=0
namenode_1  | 20/01/21 12:02:27 INFO blockmanagement.CacheReplicationMonitor: Starting CacheReplicationMonitor with interval 30000 milliseconds
namenode_1  | 20/01/21 12:02:29 INFO hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(172.28.0.3:50010, datanodeUuid=1ddb4ab2-d6fb-4c2d-9700-a088f136bb2f, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-e0a27be3-8085-4e1d-bb41-26bafd15751a;nsid=1369923457;c=1579608144274) storage 1ddb4ab2-d6fb-4c2d-9700-a088f136bb2f
namenode_1  | 20/01/21 12:02:29 INFO net.NetworkTopology: Adding a new node: /default-rack/172.28.0.3:50010
namenode_1  | 20/01/21 12:02:29 INFO blockmanagement.BlockReportLeaseManager: Registered DN 1ddb4ab2-d6fb-4c2d-9700-a088f136bb2f (172.28.0.3:50010).
namenode_1  | 20/01/21 12:02:29 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-4321d5cc-f6bb-49a3-9418-0c79b3b70a56 for DN 172.28.0.3:50010
namenode_1  | 20/01/21 12:02:29 INFO BlockStateChange: BLOCK* processReport 0xa44b76ab02811915: Processing first storage report for DS-4321d5cc-f6bb-49a3-9418-0c79b3b70a56 from datanode 1ddb4ab2-d6fb-4c2d-9700-a088f136bb2f
namenode_1  | 20/01/21 12:02:29 INFO BlockStateChange: BLOCK* processReport 0xa44b76ab02811915: from storage DS-4321d5cc-f6bb-49a3-9418-0c79b3b70a56 node DatanodeRegistration(172.28.0.3:50010, datanodeUuid=1ddb4ab2-d6fb-4c2d-9700-a088f136bb2f, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-e0a27be3-8085-4e1d-bb41-26bafd15751a;nsid=1369923457;c=1579608144274), blocks: 0, hasStaleStorage: false, processing time: 2 msecs, invalidatedBlocks: 0
datanode_1  | 20/01/21 12:02:29 INFO datanode.DataNode: Successfully sent block report 0xa44b76ab02811915,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 3 msec to generate and 54 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 20/01/21 12:02:29 INFO datanode.DataNode: Got finalize command for block pool BP-1078267749-172.28.0.2-1579608144274
test_1      | GLOB sdist-make: /app/sdks/python/setup.py

> Task :runners:flink:1.9:job-server-container:docker
 ---> 8d45f2b857a0
Step 5/7 : ADD flink-job-server.sh /opt/apache/beam/
 ---> 6eda8cc01f25
Step 6/7 : WORKDIR /opt/apache/beam
 ---> Running in 5423027b7dd9
Removing intermediate container 5423027b7dd9
 ---> 5a28b445832b
Step 7/7 : ENTRYPOINT ["./flink-job-server.sh"]
 ---> Running in 167f4d68ee85
Removing intermediate container 167f4d68ee85
 ---> 250dafef88a2
Successfully built 250dafef88a2
Successfully tagged apachebeam/flink1.9_job_server:latest

> Task :sdks:python:test-suites:direct:py2:hdfsIntegrationTest
test_1      | hdfs_integration_test create: /app/sdks/python/target/.tox/hdfs_integration_test
test_1      | hdfs_integration_test installdeps: -rbuild-requirements.txt, gsutil==4.47, holdup==1.8.0
test_1      | hdfs_integration_test inst: /app/sdks/python/target/.tox/.tmp/package/1/apache-beam-2.20.0.dev0.zip

> Task :sdks:go:resolveBuildDependencies
Resolving google.golang.org/api: commit='386d4e5f4f92f86e6aec85985761bba4b938a2d5', urls=[https://code.googlesource.com/google-api-go-client]
Resolving google.golang.org/genproto: commit='2b5a72b8730b0b16380010cfe5286c42108d88e7', urls=[https://github.com/google/go-genproto]
Resolving google.golang.org/grpc: commit='7646b5360d049a7ca31e9133315db43456f39e2e', urls=[https://github.com/grpc/grpc-go]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]

> Task :sdks:python:test-suites:direct:py2:hdfsIntegrationTest
test_1      | hdfs_integration_test installed: DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support,apache-beam==2.20.0.dev0,argcomplete==1.11.1,avro==1.9.1,boto==2.49.0,cachetools==3.1.1,certifi==2019.11.28,cffi==1.13.2,chardet==3.0.4,configparser==4.0.2,contextlib2==0.6.0.post1,crcmod==1.7,cryptography==2.8,dill==0.3.1.1,docopt==0.6.2,enum34==1.1.6,fastavro==0.21.24,fasteners==0.15,funcsigs==1.0.2,future==0.16.0,futures==3.3.0,gcs-oauth2-boto-plugin==2.5,google-api-core==1.16.0,google-apitools==0.5.28,google-auth==1.10.1,google-cloud-bigquery==1.17.1,google-cloud-bigtable==1.0.0,google-cloud-core==1.2.0,google-cloud-datastore==1.7.4,google-cloud-pubsub==1.0.2,google-cloud-spanner==1.13.0,google-reauth==0.1.0,google-resumable-media==0.4.1,googleapis-common-protos==1.51.0,googledatastore==7.0.2,grpc-google-iam-v1==0.12.3,grpcio==1.26.0,grpcio-gcp==0.2.2,grpcio-tools==1.14.2,gsutil==4.47,hdfs==2.5.8,holdup==1.8.0,httplib2==0.12.0,idna==2.8,importlib-metadata==1.4.0,ipaddress==1.0.23,mock==2.0.0,monotonic==1.5,more-itertools==5.0.0,numpy==1.16.6,oauth2client==3.0.0,pathlib2==2.3.5,pbr==5.4.4,proto-google-cloud-datastore-v1==0.90.4,protobuf==3.11.2,pyarrow==0.15.1,pyasn1==0.4.8,pyasn1-modules==0.2.8,pycparser==2.19,pydot==1.4.1,pymongo==3.10.1,pyOpenSSL==19.1.0,pyparsing==2.4.6,python-dateutil==2.8.1,pytz==2019.3,pyu2f==0.1.4,PyVCF==0.6.8,requests==2.22.0,retry-decorator==1.1.0,rsa==4.0,scandir==1.10.0,six==1.14.0,SocksiPy-branch==1.1,typing==3.7.4.1,typing-extensions==3.7.4.1,urllib3==1.25.7,zipp==1.0.0
test_1      | hdfs_integration_test run-test-pre: PYTHONHASHSEED='891269984'
test_1      | hdfs_integration_test run-test: commands[0] | holdup -t 45 http://namenode:50070 http://datanode:50075
test_1      | hdfs_integration_test run-test: commands[1] | echo 'Waiting for safe mode to end.'
test_1      | Waiting for safe mode to end.
test_1      | hdfs_integration_test run-test: commands[2] | sleep 45

> Task :sdks:go:installDependencies
> Task :sdks:go:buildLinuxAmd64
> Task :sdks:go:goBuild

> Task :sdks:python:container:resolveBuildDependencies
Resolving ./github.com/apache/beam/sdks/go@<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/go>

> Task :sdks:java:container:resolveBuildDependencies
Resolving ./github.com/apache/beam/sdks/go@<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/go>

> Task :sdks:python:container:installDependencies
> Task :sdks:java:container:installDependencies
> Task :sdks:java:container:buildLinuxAmd64
> Task :sdks:java:container:goBuild
> Task :sdks:java:container:dockerPrepare
> Task :sdks:python:container:buildDarwinAmd64
> Task :sdks:python:container:buildLinuxAmd64
> Task :sdks:python:container:goBuild
> Task :sdks:python:container:py2:copyLauncherDependencies
> Task :sdks:java:container:docker

> Task :sdks:python:test-suites:direct:py2:hdfsIntegrationTest
test_1      | hdfs_integration_test run-test: commands[3] | gsutil cp gs://dataflow-samples/shakespeare/kinglear.txt .
test_1      | Copying gs://dataflow-samples/shakespeare/kinglear.txt...
test_1      | / [0 files][    0.0 B/153.6 KiB]                                                / [1 files][153.6 KiB/153.6 KiB]                                                
test_1      | Operation completed over 1 objects/153.6 KiB.                                    
test_1      | hdfs_integration_test run-test: commands[4] | hdfscli -v -v -v upload -f kinglear.txt /
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
namenode_1  | Jan 21, 2020 12:04:21 PM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Jan 21, 2020 12:04:22 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Jan 21, 2020 12:04:22 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Jan 21, 2020 12:04:22 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Jan 21, 2020 12:04:22 PM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
namenode_1  | 20/01/21 12:04:23 INFO namenode.FSEditLog: Number of transactions: 2 Total time for transactions(ms): 14 Number of transactions batched in Syncs: 0 Number of syncs: 2 SyncTimes(ms): 157 
datanode_1  | 20/01/21 12:04:23 INFO datanode.webhdfs: 172.28.0.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 20/01/21 12:04:23 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.28.0.3:50010 for /kinglear.txt
datanode_1  | 20/01/21 12:04:23 INFO datanode.DataNode: Receiving BP-1078267749-172.28.0.2-1579608144274:blk_1073741825_1001 src: /172.28.0.3:60070 dest: /172.28.0.3:50010
datanode_1  | 20/01/21 12:04:23 INFO DataNode.clienttrace: src: /172.28.0.3:60070, dest: /172.28.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1495979031_67, offset: 0, srvID: 1ddb4ab2-d6fb-4c2d-9700-a088f136bb2f, blockid: BP-1078267749-172.28.0.2-1579608144274:blk_1073741825_1001, duration: 16637226
datanode_1  | 20/01/21 12:04:23 INFO datanode.DataNode: PacketResponder: BP-1078267749-172.28.0.2-1579608144274:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 20/01/21 12:04:23 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 20/01/21 12:04:23 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 20/01/21 12:04:23 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_-1495979031_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
test_1      | hdfs_integration_test run-test: commands[5] | python -m apache_beam.examples.wordcount --input 'hdfs://kinglear*' --output hdfs://py-wordcount-integration --hdfs_host namenode --hdfs_port 50070 --hdfs_user root
test_1      | apache_beam/__init__.py:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
test_1      |   'You are using Apache Beam with Python 2. '
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function annotate_downstream_side_inputs at 0x7f0c5ce392d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function fix_side_input_pcoll_coders at 0x7f0c5ce393d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7f0c5ce39450> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7f0c5ce394d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7f0c5ce39550> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7f0c5ce39650> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7f0c5ce396d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7f0c5ce39750> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7f0c5ce397d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7f0c5ce39950> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7f0c5ce399d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7f0c5ce39a50> ====================
test_1      | INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7f0c5cc8d2d0> for environment urn: "beam:env:embedded_python:v1"
test_1      | 
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Impulse_19)+((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2597>)_20)+(ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22)))+((ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23)+(ref_PCollection_PCollection_13/Write)))+(ref_PCollection_PCollection_12/Write)
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/Impulse_5)+(read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+((read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction)+(ref_PCollection_PCollection_1_split/Write))
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_PCollection_PCollection_1_split/Read)+((read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process)+((ref_AppliedPTransform_split_7)+(ref_AppliedPTransform_pair_with_one_8))))+(group/Write)
datanode_1  | 20/01/21 12:04:27 INFO datanode.webhdfs: 172.28.0.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (((group/Read)+((ref_AppliedPTransform_count_13)+(ref_AppliedPTransform_format_14)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_25)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:apache_beam.io.hadoopfilesystem:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 20/01/21 12:04:29 INFO datanode.webhdfs: 172.28.0.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-2bcac2243c4611eaad260242ac1c0004/d8a1778a-a715-4c1f-80c1-51c8e2cc9600.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 20/01/21 12:04:30 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.28.0.3:50010 for /beam-temp-py-wordcount-integration-2bcac2243c4611eaad260242ac1c0004/d8a1778a-a715-4c1f-80c1-51c8e2cc9600.py-wordcount-integration
datanode_1  | 20/01/21 12:04:30 INFO datanode.DataNode: Receiving BP-1078267749-172.28.0.2-1579608144274:blk_1073741826_1002 src: /172.28.0.3:60290 dest: /172.28.0.3:50010
datanode_1  | 20/01/21 12:04:30 INFO DataNode.clienttrace: src: /172.28.0.3:60290, dest: /172.28.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1340652976_69, offset: 0, srvID: 1ddb4ab2-d6fb-4c2d-9700-a088f136bb2f, blockid: BP-1078267749-172.28.0.2-1579608144274:blk_1073741826_1002, duration: 3937078
datanode_1  | 20/01/21 12:04:30 INFO datanode.DataNode: PacketResponder: BP-1078267749-172.28.0.2-1579608144274:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 20/01/21 12:04:30 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-2bcac2243c4611eaad260242ac1c0004/d8a1778a-a715-4c1f-80c1-51c8e2cc9600.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-1340652976_69
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_31)+(ref_PCollection_PCollection_20/Write))
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (ref_PCollection_PCollection_12/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32)+(ref_PCollection_PCollection_21/Write))
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (ref_PCollection_PCollection_12/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33)
test_1      | INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:apache_beam.io.filebasedsink:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
test_1      | hdfs_integration_test run-test-post: commands[0] | /app/sdks/python/scripts/run_tox_cleanup.sh
test_1      | ___________________________________ summary ____________________________________
test_1      |   hdfs_integration_test: commands succeeded
test_1      |   congratulations :)
hdfs_it-jenkins-beam_postcommit_python2-1503_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python2-1503_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python2-1503_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python2-1503_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python2-1503_namenode_1 ... done
Aborting on container exit...

real	2m55.583s
user	0m1.360s
sys	0m0.178s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python2-1503 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python2-1503_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python2-1503_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python2-1503_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python2-1503_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python2-1503_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python2-1503_test_1     ... done
Removing network hdfs_it-jenkins-beam_postcommit_python2-1503_test_net

real	0m1.140s
user	0m0.611s
sys	0m0.119s

FAILURE: Build completed with 4 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 0s
105 actionable tasks: 79 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/w7rt2pqit3yk2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
