Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/08/19 07:01:44 UTC

Build failed in Jenkins: beam_PostCommit_Python37 #253

See <https://builds.apache.org/job/beam_PostCommit_Python37/253/display/redirect>

------------------------------------------
[...truncated 268.45 KB...]
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Some syntactic constructs of Python 3 are not yet fully supported by Apache Beam.
  'Some syntactic constructs of Python 3 are not yet fully supported by '
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:642: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
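
The BeamDeprecationWarning lines above flag code that reads options back through <pipeline>.options after construction. The supported pattern is to build a PipelineOptions object first, keep a handle on it, and pass it into the Pipeline. A minimal sketch, with project and bucket names as placeholders rather than values from this build:

    # Construct options up front and pass them to the Pipeline,
    # instead of reading <pipeline>.options inside a transform.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions(
        ['--project=my-project',                 # placeholder
         '--temp_location=gs://my-bucket/tmp'])  # placeholder
    # View the options through our own handle, not via the pipeline.
    temp_location = options.view_as(GoogleCloudOptions).temp_location
    with beam.Pipeline(options=options) as p:
        p | beam.Create(['a', 'b']) | beam.Map(print)
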
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test doesn't work on DirectRunner.
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 23.704s

OK (SKIP=1)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ERROR
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok

======================================================================
ERROR: test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/streaming_wordcount_it_test.py",> line 106, in test_streaming_wordcount_it
    self.test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/streaming_wordcount.py",> line 101, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 53, in run_pipeline
    pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 484, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/retry.py",> line 206, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 530, in create_job
    self.create_job_description(job)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 560, in create_job_description
    resources = self._stage_resources(job.options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 490, in _stage_resources
    staging_location=google_cloud_options.staging_location)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 168, in stage_job_resources
    requirements_cache_path)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/retry.py",> line 206, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 487, in _populate_requirements_cache
    processes.check_output(cmd_args, stderr=processes.STDOUT)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/processes.py",> line 91, in check_output
    .format(traceback.format_exc(), args[0][6], error.output))
RuntimeError: Full traceback: Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/processes.py",> line 83, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python3.7/subprocess.py", line 395, in check_output
    **kwargs).stdout
  File "/usr/lib/python3.7/subprocess.py", line 487, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']' returned non-zero exit status 1.
 
 Pip install failed for package: -r         
 Output from execution of subprocess: b'Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))\n  File was already downloaded /tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz\nCollecting mock (from -r postcommit_requirements.txt (line 2))\n  File was already downloaded /tmp/dataflow-requirements-cache/mock-3.0.5.tar.gz\nCollecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))\n  File was already downloaded /tmp/dataflow-requirements-cache/setuptools-41.1.0.zip\nCollecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))\n  ERROR: Could not find a version that satisfies the requirement six (from pyhamcrest->-r postcommit_requirements.txt (line 1)) (from versions: none)\nERROR: No matching distribution found for six (from pyhamcrest->-r postcommit_requirements.txt (line 1))\n'
-------------------- >> begin captured logging << --------------------
root: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set the region explicitly. https://cloud.google.com/compute/docs/regions-zones/regions-zones
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
root: DEBUG: Injecting 500 numbers to topic projects/apache-beam-testing/topics/wc_topic_input8b5ef13e-fa78-4015-ad01-818884caa6b9
root: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set the region explicitly. https://cloud.google.com/compute/docs/regions-zones/regions-zones
google.cloud.pubsub_v1.publisher._batch.thread: DEBUG: Monitor is waking up
google.cloud.pubsub_v1.publisher._batch.thread: DEBUG: gRPC Publish took 0.23137354850769043 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0819060329-882449.1566194609.882606/pipeline.pb...
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0819060329-882449.1566194609.882606/pipeline.pb in 0 seconds.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0819060329-882449.1566194609.882606/requirements.txt...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0819060329-882449.1566194609.882606/requirements.txt in 0 seconds.
root: INFO: Executing command: ['<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
--------------------- >> end captured logging << ---------------------
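
The root cause in the traceback above is the requirements-staging step: with --no-binary :all:, pip found no source distribution for six and aborted, which typically points at a transient PyPI or mirror problem rather than a defect in the test itself. One way to check is to rerun the exact command the stager logged, in a comparable virtualenv (the interpreter path in the log is this job's workspace and will differ locally):

    python -m pip download --dest /tmp/dataflow-requirements-cache \
        -r postcommit_requirements.txt --exists-action i --no-binary :all:
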
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_03_39-6362846126131247669?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_18_24-6032632817155818387?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:572: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_27_49-3070249808915643784?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_03_36-9120599967296963226?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_26_05-8163879239847091331?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_35_37-14756420142360240778?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_44_55-2834616019957829931?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_03_37-17035896017625173365?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_17_20-13178134069005038967?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_26_28-13881534583005348722?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_36_54-3964935352019371216?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_03_36-1544013652776404201?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_26_53-9957247860632449854?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_35_09-6863450680313945029?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_03_36-3298173319567146475?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_13_00-12423985121921844951?project=apache-beam-testing.
  experiments = p.options.view_as(DebugOptions).experiments or []
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_24_17-13570049842921240452?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:642: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_34_55-15903143980200248173?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_44_07-12760906830810191249?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_03_36-5895050446387054533?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_12_33-6332569852680694373?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_21_59-9034550965584670006?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_32_21-5231047996282840155?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_41_37-11391407452917859546?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_50_39-11000496943121445234?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_03_43-9866658262991142279?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_12_55-11878460804429177990?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_21_58-390095218271761769?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_31_35-14011256251591017073?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_40_28-8671575763281418862?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_03_37-7549343271792614373?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_14_25-4610374174567806488?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_23_52-3336019200783600651?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-18_23_34_48-7288066061124426112?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:642: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
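
Alongside the options warnings, several runs above report that BigQuerySink is deprecated in favor of WriteToBigQuery. A minimal sketch of the replacement the warning names, with table and schema values as placeholders rather than values from these pipelines:

    # Hedged sketch: WriteToBigQuery replaces the deprecated BigQuerySink.
    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'word': 'king', 'count': 1}])  # placeholder rows
         | beam.io.WriteToBigQuery(
             table='my-project:my_dataset.my_table',    # placeholder
             schema='word:STRING,count:INTEGER',
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
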

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3497.853s

FAILED (SKIP=5, errors=1)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 89

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
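
To reproduce a failure like this outside Jenkins, the task named under "What went wrong" can be rerun directly from the repository root with the flags Gradle suggests, e.g.:

    ./gradlew :sdks:python:test-suites:dataflow:py37:postCommitIT --stacktrace --info
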

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 59m 9s
65 actionable tasks: 48 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/ryg7je4s3mi7u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #254

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python37/254/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-7882] Invoke Spark API incompatible methods by reflection

------------------------------------------
[...truncated 216.66 KB...]
namenode_1  | 19/08/19 08:12:26 INFO namenode.FSImage: Planning to load image: FSImageFile(file=/hadoop/dfs/name/current/fsimage_0000000000000000000, cpktTxId=0000000000000000000)
namenode_1  | 19/08/19 08:12:26 INFO namenode.FSImageFormatPBINode: Loading 1 INodes.
namenode_1  | 19/08/19 08:12:26 INFO namenode.FSImageFormatProtobuf: Loaded FSImage in 0 seconds.
namenode_1  | 19/08/19 08:12:26 INFO namenode.FSImage: Loaded image for txid 0 from /hadoop/dfs/name/current/fsimage_0000000000000000000
namenode_1  | 19/08/19 08:12:26 INFO namenode.FSNamesystem: Need to save fs image? false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
namenode_1  | 19/08/19 08:12:26 INFO namenode.FSEditLog: Starting log segment at 1
namenode_1  | 19/08/19 08:12:27 INFO namenode.NameCache: initialized with 0 entries 0 lookups
namenode_1  | 19/08/19 08:12:27 INFO namenode.FSNamesystem: Finished loading FSImage in 353 msecs
datanode_1  | 19/08/19 08:12:27 INFO ipc.Client: Retrying connect to server: namenode/172.25.0.2:8020. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
namenode_1  | 19/08/19 08:12:27 INFO namenode.NameNode: RPC server is binding to 0.0.0.0:8020
namenode_1  | 19/08/19 08:12:27 INFO ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
namenode_1  | 19/08/19 08:12:27 INFO ipc.Server: Starting Socket Reader #1 for port 8020
namenode_1  | 19/08/19 08:12:27 INFO namenode.FSNamesystem: Registered FSNamesystemState MBean
namenode_1  | 19/08/19 08:12:27 INFO namenode.LeaseManager: Number of blocks under construction: 0
namenode_1  | 19/08/19 08:12:27 INFO blockmanagement.BlockManager: initializing replication queues
namenode_1  | 19/08/19 08:12:27 INFO hdfs.StateChange: STATE* Leaving safe mode after 0 secs
namenode_1  | 19/08/19 08:12:27 INFO hdfs.StateChange: STATE* Network topology has 0 racks and 0 datanodes
namenode_1  | 19/08/19 08:12:27 INFO hdfs.StateChange: STATE* UnderReplicatedBlocks has 0 blocks
namenode_1  | 19/08/19 08:12:27 INFO blockmanagement.BlockManager: Total number of blocks            = 0
namenode_1  | 19/08/19 08:12:27 INFO blockmanagement.BlockManager: Number of invalid blocks          = 0
namenode_1  | 19/08/19 08:12:27 INFO blockmanagement.BlockManager: Number of under-replicated blocks = 0
namenode_1  | 19/08/19 08:12:27 INFO blockmanagement.BlockManager: Number of  over-replicated blocks = 0
namenode_1  | 19/08/19 08:12:27 INFO blockmanagement.BlockManager: Number of blocks being written    = 0
namenode_1  | 19/08/19 08:12:27 INFO hdfs.StateChange: STATE* Replication Queue initialization scan for invalid, over- and under-replicated blocks completed in 12 msec
namenode_1  | 19/08/19 08:12:27 INFO ipc.Server: IPC Server Responder: starting
namenode_1  | 19/08/19 08:12:27 INFO ipc.Server: IPC Server listener on 8020: starting
namenode_1  | 19/08/19 08:12:27 INFO namenode.NameNode: NameNode RPC up at: namenode/172.25.0.2:8020
namenode_1  | 19/08/19 08:12:27 INFO namenode.FSNamesystem: Starting services required for active state
namenode_1  | 19/08/19 08:12:27 INFO namenode.FSDirectory: Initializing quota with 4 thread(s)
namenode_1  | 19/08/19 08:12:27 INFO namenode.FSDirectory: Quota initialization completed in 4 milliseconds
namenode_1  | name space=1
namenode_1  | storage space=0
namenode_1  | storage types=RAM_DISK=0, SSD=0, DISK=0, ARCHIVE=0
namenode_1  | 19/08/19 08:12:27 INFO blockmanagement.CacheReplicationMonitor: Starting CacheReplicationMonitor with interval 30000 milliseconds
datanode_1  | 19/08/19 08:12:28 INFO ipc.Client: Retrying connect to server: namenode/172.25.0.2:8020. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
datanode_1  | 19/08/19 08:12:28 INFO datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool <registering> (Datanode Uuid unassigned) service to namenode/172.25.0.2:8020
datanode_1  | 19/08/19 08:12:28 INFO common.Storage: Using 1 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=1, dataDirs=1)
datanode_1  | 19/08/19 08:12:28 INFO common.Storage: Lock on /hadoop/dfs/data/in_use.lock acquired by nodename 82@datanode
datanode_1  | 19/08/19 08:12:28 INFO common.Storage: Storage directory /hadoop/dfs/data is not formatted for namespace 1418625448. Formatting...
datanode_1  | 19/08/19 08:12:28 INFO common.Storage: Generated new storageID DS-2231bedb-b4f2-4fed-a7a5-eb471f5c1038 for directory /hadoop/dfs/data
datanode_1  | 19/08/19 08:12:28 INFO common.Storage: Analyzing storage directories for bpid BP-646680128-172.25.0.2-1566202344544
datanode_1  | 19/08/19 08:12:28 INFO common.Storage: Locking is disabled for /hadoop/dfs/data/current/BP-646680128-172.25.0.2-1566202344544
datanode_1  | 19/08/19 08:12:28 INFO common.Storage: Block pool storage directory /hadoop/dfs/data/current/BP-646680128-172.25.0.2-1566202344544 is not formatted for BP-646680128-172.25.0.2-1566202344544. Formatting ...
datanode_1  | 19/08/19 08:12:28 INFO common.Storage: Formatting block pool BP-646680128-172.25.0.2-1566202344544 directory /hadoop/dfs/data/current/BP-646680128-172.25.0.2-1566202344544/current
datanode_1  | 19/08/19 08:12:28 INFO datanode.DataNode: Setting up storage: nsid=1418625448;bpid=BP-646680128-172.25.0.2-1566202344544;lv=-57;nsInfo=lv=-63;cid=CID-89ba773f-c9f0-4b53-836f-c53e8620da1e;nsid=1418625448;c=1566202344544;bpid=BP-646680128-172.25.0.2-1566202344544;dnuuid=null
datanode_1  | 19/08/19 08:12:28 INFO datanode.DataNode: Generated and persisted new Datanode UUID 103d9f99-03df-4094-b6d7-c983b6e0d428
datanode_1  | 19/08/19 08:12:28 INFO impl.FsDatasetImpl: Added new volume: DS-2231bedb-b4f2-4fed-a7a5-eb471f5c1038
datanode_1  | 19/08/19 08:12:28 INFO impl.FsDatasetImpl: Added volume - /hadoop/dfs/data/current, StorageType: DISK
datanode_1  | 19/08/19 08:12:28 INFO impl.FsDatasetImpl: Registered FSDatasetState MBean
datanode_1  | 19/08/19 08:12:28 INFO impl.FsDatasetImpl: Volume reference is released.
datanode_1  | 19/08/19 08:12:28 INFO impl.FsDatasetImpl: Adding block pool BP-646680128-172.25.0.2-1566202344544
datanode_1  | 19/08/19 08:12:28 INFO impl.FsDatasetImpl: Scanning block pool BP-646680128-172.25.0.2-1566202344544 on volume /hadoop/dfs/data/current...
datanode_1  | 19/08/19 08:12:28 INFO impl.FsDatasetImpl: Time taken to scan block pool BP-646680128-172.25.0.2-1566202344544 on /hadoop/dfs/data/current: 32ms
datanode_1  | 19/08/19 08:12:28 INFO impl.FsDatasetImpl: Total time to scan all replicas for block pool BP-646680128-172.25.0.2-1566202344544: 34ms
datanode_1  | 19/08/19 08:12:28 INFO impl.FsDatasetImpl: Adding replicas to map for block pool BP-646680128-172.25.0.2-1566202344544 on volume /hadoop/dfs/data/current...
datanode_1  | 19/08/19 08:12:28 INFO impl.BlockPoolSlice: Replica Cache file: /hadoop/dfs/data/current/BP-646680128-172.25.0.2-1566202344544/current/replicas doesn't exist 
datanode_1  | 19/08/19 08:12:28 INFO impl.FsDatasetImpl: Time to add replicas to map for block pool BP-646680128-172.25.0.2-1566202344544 on volume /hadoop/dfs/data/current: 1ms
datanode_1  | 19/08/19 08:12:28 INFO impl.FsDatasetImpl: Total time to add all replicas to map: 2ms
datanode_1  | 19/08/19 08:12:28 INFO datanode.VolumeScanner: Now scanning bpid BP-646680128-172.25.0.2-1566202344544 on volume /hadoop/dfs/data
datanode_1  | 19/08/19 08:12:28 INFO datanode.DirectoryScanner: Periodic Directory Tree Verification scan starting at 8/19/19 9:20 AM with interval of 21600000ms
datanode_1  | 19/08/19 08:12:28 INFO datanode.DataNode: Block pool BP-646680128-172.25.0.2-1566202344544 (Datanode Uuid 103d9f99-03df-4094-b6d7-c983b6e0d428) service to namenode/172.25.0.2:8020 beginning handshake with NN
datanode_1  | 19/08/19 08:12:28 INFO datanode.VolumeScanner: VolumeScanner(/hadoop/dfs/data, DS-2231bedb-b4f2-4fed-a7a5-eb471f5c1038): finished scanning block pool BP-646680128-172.25.0.2-1566202344544
datanode_1  | 19/08/19 08:12:28 INFO datanode.VolumeScanner: VolumeScanner(/hadoop/dfs/data, DS-2231bedb-b4f2-4fed-a7a5-eb471f5c1038): no suitable block pools found to scan.  Waiting 1814399941 ms.
namenode_1  | 19/08/19 08:12:28 INFO hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(172.25.0.3:50010, datanodeUuid=103d9f99-03df-4094-b6d7-c983b6e0d428, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-89ba773f-c9f0-4b53-836f-c53e8620da1e;nsid=1418625448;c=1566202344544) storage 103d9f99-03df-4094-b6d7-c983b6e0d428
namenode_1  | 19/08/19 08:12:28 INFO net.NetworkTopology: Adding a new node: /default-rack/172.25.0.3:50010
namenode_1  | 19/08/19 08:12:28 INFO blockmanagement.BlockReportLeaseManager: Registered DN 103d9f99-03df-4094-b6d7-c983b6e0d428 (172.25.0.3:50010).
datanode_1  | 19/08/19 08:12:28 INFO datanode.DataNode: Block pool Block pool BP-646680128-172.25.0.2-1566202344544 (Datanode Uuid 103d9f99-03df-4094-b6d7-c983b6e0d428) service to namenode/172.25.0.2:8020 successfully registered with NN
datanode_1  | 19/08/19 08:12:28 INFO datanode.DataNode: For namenode namenode/172.25.0.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
namenode_1  | 19/08/19 08:12:28 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-2231bedb-b4f2-4fed-a7a5-eb471f5c1038 for DN 172.25.0.3:50010
namenode_1  | 19/08/19 08:12:28 INFO BlockStateChange: BLOCK* processReport 0x28a727489c94aaa1: Processing first storage report for DS-2231bedb-b4f2-4fed-a7a5-eb471f5c1038 from datanode 103d9f99-03df-4094-b6d7-c983b6e0d428
namenode_1  | 19/08/19 08:12:28 INFO BlockStateChange: BLOCK* processReport 0x28a727489c94aaa1: from storage DS-2231bedb-b4f2-4fed-a7a5-eb471f5c1038 node DatanodeRegistration(172.25.0.3:50010, datanodeUuid=103d9f99-03df-4094-b6d7-c983b6e0d428, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-89ba773f-c9f0-4b53-836f-c53e8620da1e;nsid=1418625448;c=1566202344544), blocks: 0, hasStaleStorage: false, processing time: 2 msecs, invalidatedBlocks: 0
datanode_1  | 19/08/19 08:12:28 INFO datanode.DataNode: Successfully sent block report 0x28a727489c94aaa1,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 4 msec to generate and 57 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 19/08/19 08:12:28 INFO datanode.DataNode: Got finalize command for block pool BP-646680128-172.25.0.2-1566202344544
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
namenode_1  | Aug 19, 2019 8:13:12 AM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Aug 19, 2019 8:13:13 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Aug 19, 2019 8:13:13 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Aug 19, 2019 8:13:13 AM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Aug 19, 2019 8:13:14 AM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/08/19 08:13:14 INFO datanode.webhdfs: 172.25.0.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/08/19 08:13:14 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.25.0.3:50010 for /kinglear.txt
datanode_1  | 19/08/19 08:13:15 INFO datanode.DataNode: Receiving BP-646680128-172.25.0.2-1566202344544:blk_1073741825_1001 src: /172.25.0.3:52520 dest: /172.25.0.3:50010
datanode_1  | 19/08/19 08:13:15 INFO DataNode.clienttrace: src: /172.25.0.3:52520, dest: /172.25.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1666803247_67, offset: 0, srvID: 103d9f99-03df-4094-b6d7-c983b6e0d428, blockid: BP-646680128-172.25.0.2-1566202344544:blk_1073741825_1001, duration: 17576686
datanode_1  | 19/08/19 08:13:15 INFO datanode.DataNode: PacketResponder: BP-646680128-172.25.0.2-1566202344544:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/08/19 08:13:15 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/08/19 08:13:15 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/08/19 08:13:15 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_-1666803247_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
test_1      | /usr/local/lib/python3.7/site-packages/apache_beam/__init__.py:84: UserWarning: Some syntactic constructs of Python 3 are not yet fully supported by Apache Beam.
test_1      |   'Some syntactic constructs of Python 3 are not yet fully supported by '
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7efea6674170> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7efea6674290> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7efea6674320> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7efea66743b0> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7efea6674440> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7efea6674560> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7efea66745f0> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7efea6674680> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7efea6674710> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7efea66748c0> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7efea6674950> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7efea66749e0> ====================
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (((ref_AppliedPTransform_read/Read_3)+(ref_AppliedPTransform_split_4))+(ref_AppliedPTransform_pair_with_one_5))+(group/Write)
datanode_1  | 19/08/19 08:13:18 INFO datanode.webhdfs: 172.25.0.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running ((((((group/Read)+(ref_AppliedPTransform_count_10))+(ref_AppliedPTransform_format_11))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+(ref_AppliedPTransform_write/Write/WriteImpl/Pair_19))+(ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20))+(write/Write/WriteImpl/GroupByKey/Write)
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/08/19 08:13:21 INFO datanode.webhdfs: 172.25.0.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-33b7f752c25911e98ab80242ac190004/a2fe14bb-7b72-4a39-8fbc-5a08f7520bc3.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/08/19 08:13:21 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.25.0.3:50010 for /beam-temp-py-wordcount-integration-33b7f752c25911e98ab80242ac190004/a2fe14bb-7b72-4a39-8fbc-5a08f7520bc3.py-wordcount-integration
datanode_1  | 19/08/19 08:13:21 INFO datanode.DataNode: Receiving BP-646680128-172.25.0.2-1566202344544:blk_1073741826_1002 src: /172.25.0.3:52566 dest: /172.25.0.3:50010
datanode_1  | 19/08/19 08:13:21 INFO DataNode.clienttrace: src: /172.25.0.3:52566, dest: /172.25.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-967481781_69, offset: 0, srvID: 103d9f99-03df-4094-b6d7-c983b6e0d428, blockid: BP-646680128-172.25.0.2-1566202344544:blk_1073741826_1002, duration: 4597042
datanode_1  | 19/08/19 08:13:21 INFO datanode.DataNode: PacketResponder: BP-646680128-172.25.0.2-1566202344544:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/08/19 08:13:21 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-33b7f752c25911e98ab80242ac190004/a2fe14bb-7b72-4a39-8fbc-5a08f7520bc3.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-967481781_69
test_1      | INFO:root:Running ((write/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/Extract_25))+(ref_PCollection_PCollection_17/Write)
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.16 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python37-254_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python37-254_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python37-254_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python37-254_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python37-254_namenode_1 ... done
Aborting on container exit...

real	1m33.337s
user	0m1.240s
sys	0m0.138s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python37-254 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python37-254_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python37-254_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python37-254_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python37-254_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python37-254_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python37-254_namenode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python37-254_test_net

real	0m0.800s
user	0m0.615s
sys	0m0.107s

> Task :sdks:python:test-suites:direct:py37:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDirectRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it,apache_beam.io.gcp.pubsub_integration_test:PubSubIntegrationTest,apache_beam.io.gcp.big_query_query_to_table_it_test:BigQueryQueryToTableIT,apache_beam.io.gcp.bigquery_io_read_it_test,apache_beam.io.gcp.bigquery_read_it_test,apache_beam.io.gcp.bigquery_write_it_test,apache_beam.io.gcp.datastore.v1new.datastore_write_it_test --nocapture --processes=8 --process-timeout=4500
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:179: UserWarning: Some syntactic constructs of Python 3 are not yet fully supported by Apache Beam.
  'Some syntactic constructs of Python 3 are not yet fully supported by '
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/1398941891/lib/python3.7/site-packages/setuptools/dist.py>:474: UserWarning: Normalizing '2.16.0.dev' to '2.16.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Some syntactic constructs of Python 3 are not yet fully supported by Apache Beam.
  'Some syntactic constructs of Python 3 are not yet fully supported by '
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:642: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test doesn't work on DirectRunner.
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 23.850s

OK (SKIP=1)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
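
Here the failing task is the virtualenv setup rather than the test suite itself; as with #253, rerunning just that task should surface the underlying pip or virtualenv error, e.g.:

    ./gradlew :sdks:python:test-suites:dataflow:py37:setupVirtualenv --stacktrace --info
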

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 30s
63 actionable tasks: 46 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/v4akdr7nnm7g4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org