Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/05/14 18:56:02 UTC

Build failed in Jenkins: beam_PostCommit_Python2 #2422

See <https://builds.apache.org/job/beam_PostCommit_Python2/2422/display/redirect?page=changes>

Changes:

[github] [BEAM-9634] Add natural language analysis transform (#11611)


------------------------------------------
[...truncated 9.02 MB...]
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:45:46.282Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:45:46.318Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:48:51.027Z: JOB_MESSAGE_BASIC: Finished operation read from datastore/UserQuery/Read+read from datastore/SplitQuery+read from datastore/Reshuffle/AddRandomKeys+read from datastore/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Reify+read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:48:51.087Z: JOB_MESSAGE_BASIC: Executing operation read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:48:51.133Z: JOB_MESSAGE_BASIC: Finished operation read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:48:51.188Z: JOB_MESSAGE_BASIC: Executing operation read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Read+read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read from datastore/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read from datastore/Reshuffle/RemoveRandomKeys+read from datastore/Read+Globally/CombineGlobally(CountCombineFn)/KeyWithVoid+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:48:54.216Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:49:05.259Z: JOB_MESSAGE_BASIC: Finished operation read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Read+read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read from datastore/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read from datastore/Reshuffle/RemoveRandomKeys+read from datastore/Read+Globally/CombineGlobally(CountCombineFn)/KeyWithVoid+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:49:05.314Z: JOB_MESSAGE_BASIC: Executing operation Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:49:05.358Z: JOB_MESSAGE_BASIC: Finished operation Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:49:05.427Z: JOB_MESSAGE_BASIC: Executing operation Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Read+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract+Globally/CombineGlobally(CountCombineFn)/UnKey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:49:17.141Z: JOB_MESSAGE_BASIC: Finished operation Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Read+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract+Globally/CombineGlobally(CountCombineFn)/UnKey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:49:17.269Z: JOB_MESSAGE_DEBUG: Value "Globally/CombineGlobally(CountCombineFn)/UnKey.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:49:17.333Z: JOB_MESSAGE_BASIC: Executing operation Globally/CombineGlobally(CountCombineFn)/InjectDefault/_UnpickledSideInput(UnKey.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:49:17.366Z: JOB_MESSAGE_BASIC: Finished operation Globally/CombineGlobally(CountCombineFn)/InjectDefault/_UnpickledSideInput(UnKey.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:49:17.418Z: JOB_MESSAGE_DEBUG: Value "Globally/CombineGlobally(CountCombineFn)/InjectDefault/_UnpickledSideInput(UnKey.out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:49:17.469Z: JOB_MESSAGE_BASIC: Executing operation Globally/CombineGlobally(CountCombineFn)/DoOnce/Read+Globally/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:49:22.173Z: JOB_MESSAGE_BASIC: Finished operation Globally/CombineGlobally(CountCombineFn)/DoOnce/Read+Globally/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:49:22.235Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:49:22.282Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:49:22.335Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:49:30.789Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:49:30.845Z: JOB_MESSAGE_DEBUG: Executing success step success49
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:49:30.946Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:49:31Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:49:31.025Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:51:13.804Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:51:13.850Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-14T18:51:13.884Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-05-14_11_43_48-11141188036276706352 is in state JOB_STATE_DONE
apache_beam.io.gcp.datastore.v1new.datastore_write_it_pipeline: INFO: Deleting entities.
root: WARNING: Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
root: INFO: Using Python SDK docker image: apache/beam_python2.7_sdk:2.22.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
apache_beam.runners.dataflow.dataflow_runner: WARNING: Typical end users should not use this worker jar feature. It can only be used when FnAPI is enabled.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0514182746-212909.1589480866.213032/beamapp-jenkins-0514182746-212909.1589481325.609669/beamapp-jenkins-0514182746-212909.1589481812.332504/beamapp-jenkins-0514182746-212909.1589482291.245877/pipeline.pb...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0514182746-212909.1589480866.213032/beamapp-jenkins-0514182746-212909.1589481325.609669/beamapp-jenkins-0514182746-212909.1589481812.332504/beamapp-jenkins-0514182746-212909.1589482291.245877/pipeline.pb in 0 seconds.
apache_beam.runners.portability.stager: INFO: Executing command: ['<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
--------------------- >> end captured logging << ---------------------
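
[Editor note: the stager populates a local requirements cache by shelling out to pip, as in the command at the end of the captured logging above. A minimal sketch of the same invocation, runnable outside the test harness to reproduce the staging step; the interpreter path and requirements file here are placeholders for the ones in the log:]

    import subprocess
    import sys

    # Mirrors the stager command logged above; paths are placeholders.
    cmd = [
        sys.executable, '-m', 'pip', 'download',
        '--dest', '/tmp/dataflow-requirements-cache',
        '-r', 'postcommit_requirements.txt',
        '--exists-action', 'i',
        '--no-binary', ':all:',
    ]
    # Raises CalledProcessError on a non-zero exit, exactly the failure
    # surfaced in the traceback below.
    subprocess.check_output(cmd, stderr=subprocess.STDOUT)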

======================================================================
ERROR: test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/pubsub_integration_test.py",> line 215, in test_streaming_with_attributes
    self._test_streaming(with_attributes=True)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/pubsub_integration_test.py",> line 207, in _test_streaming
    timestamp_attribute=self.TIMESTAMP_ATTRIBUTE)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/pubsub_it_pipeline.py",> line 95, in run_pipeline
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 525, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 581, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 236, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 635, in create_job
    self.create_job_description(job)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 691, in create_job_description
    resources = self._stage_resources(job.options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 588, in _stage_resources
    staging_location=google_cloud_options.staging_location)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 349, in create_and_stage_job_resources
    options, temp_dir, build_setup_args, populate_requirements_cache)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 172, in create_job_resources
    setup_options.requirements_file, requirements_cache_path)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 236, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 558, in _populate_requirements_cache
    processes.check_output(cmd_args, stderr=processes.STDOUT)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/processes.py",> line 99, in check_output
    .format(traceback.format_exc(), args[0][6], error.output))
RuntimeError: Full traceback: Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/processes.py",> line 91, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python2.7/subprocess.py", line 574, in check_output
    raise CalledProcessError(retcode, cmd, output=output)
CalledProcessError: Command '['<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/bin/python',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']' returned non-zero exit status 2
 
 Pip install failed for package: -r           
 Output from execution of subprocess: DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. pip 21.0 will drop support for Python 2.7 in January 2021. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Collecting pyhamcrest!=1.10.0,<2.0.0
  File was already downloaded /tmp/dataflow-requirements-cache/PyHamcrest-1.10.1.tar.gz
Collecting mock<3.0.0
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
ERROR: Exception:
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/pip/_internal/cli/base_command.py",> line 188, in _main
    status = self.run(options, args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/pip/_internal/cli/req_command.py",> line 185, in wrapper
    return func(self, options, args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/pip/_internal/commands/download.py",> line 132, in run
    reqs, check_supported_wheels=True
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/pip/_internal/resolution/legacy/resolver.py",> line 179, in resolve
    discovered_reqs.extend(self._resolve_one(requirement_set, req))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/pip/_internal/resolution/legacy/resolver.py",> line 362, in _resolve_one
    abstract_dist = self._get_abstract_dist_for(req_to_install)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/pip/_internal/resolution/legacy/resolver.py",> line 314, in _get_abstract_dist_for
    abstract_dist = self.preparer.prepare_linked_requirement(req)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/pip/_internal/operations/prepare.py",> line 412, in prepare_linked_requirement
    hashes=hashes,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/pip/_internal/operations/prepare.py",> line 203, in unpack_url
    unpack_file(file.path, location, file.content_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/pip/_internal/utils/unpacking.py",> line 261, in unpack_file
    untar_file(filename, location)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/pip/_internal/utils/unpacking.py",> line 223, in untar_file
    shutil.copyfileobj(fp, destfp)
IOError: [Errno 28] No space left on device

-------------------- >> begin captured logging << --------------------
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
google.cloud.pubsub_v1.publisher._batch.thread: DEBUG: Monitor is waking up
root: WARNING: Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
root: INFO: Using Python SDK docker image: apache/beam_python2.7_sdk:2.22.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
google.cloud.pubsub_v1.publisher._batch.thread: DEBUG: gRPC Publish took 0.136137008667 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0514185348-016186.1589482428.016311/pipeline.pb...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0514185348-016186.1589482428.016311/pipeline.pb in 0 seconds.
apache_beam.runners.portability.stager: INFO: Executing command: ['<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
--------------------- >> end captured logging << ---------------------
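
[Editor note: the root cause of this failure is the IOError earlier in the report: [Errno 28] means the filesystem holding /tmp/dataflow-requirements-cache filled up while pip was unpacking an sdist. A minimal diagnostic sketch, not Beam code, with an assumed 1 GiB threshold, that fails fast with a clear message instead of dying mid-untar:]

    import os

    # Destination of the `pip download --dest ...` call in the log.
    CACHE_DIR = '/tmp/dataflow-requirements-cache'
    MIN_FREE_BYTES = 1 << 30  # assumed 1 GiB headroom; tune for the agent

    def free_bytes(path):
        # os.statvfs is Unix-only and available on the Python 2.7
        # interpreter this job ran under.
        st = os.statvfs(path)
        return st.f_bavail * st.f_frsize

    probe = CACHE_DIR if os.path.isdir(CACHE_DIR) else '/tmp'
    if free_bytes(probe) < MIN_FREE_BYTES:
        raise RuntimeError('%s has only %d bytes free; clean the agent '
                           'before staging.' % (probe, free_bytes(probe)))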

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 58 tests in 2477.093s

FAILED (SKIP=7, errors=20)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_15_48-13862602841241621407?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_30_26-15363677441138207500?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_38_28-7979371074472217784?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_15_41-11269545644327351072?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_37_29-10681175934869251701?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_46_37-10253386108879935145?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_15_41-17311912248025253725?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_33_42-13205028851703756280?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_41_56-6080149533385213596?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_15_42-13498438487682886973?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_23_55-10240068618842867540?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_32_05-12024937578411228212?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_40_13-6580625292468005938?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_15_40-277014607780517090?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_23_17-6168537172520452681?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_31_53-6851774859741444224?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_41_14-18189282470931833577?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_15_44-15901118601473615724?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_28_02-580863963993539631?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_35_44-9778745505804383536?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_43_48-11141188036276706352?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_15_41-2878288476506136294?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_24_30-12450384881304227508?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_33_09-15707127783772901401?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_41_18-3525121615200178762?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_15_42-678059003263264095?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_25_51-12302090598113388378?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_36_47-9131745694085631240?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_11_44_42-1896663366024349634?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 5 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:container:pullLicenses'.
> Process 'command './sdks/java/container/license_scripts/license_script.sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:container:py2:docker'.
> Process 'command 'docker'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 81

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 50

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

5: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 116

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
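
[Editor note: all five Gradle failures are consistent with the same full disk on the agent: the license script, the py2 Docker image build, both direct-runner suites, and the Dataflow postcommit all write under the workspace, /tmp, and the Docker storage directory. A hedged sketch for surveying free space on the agent; the mount points are illustrative guesses, not taken from the log:]

    import os

    # Illustrative mount points a postcommit agent tends to fill.
    PATHS = ['/tmp', '/var/lib/docker', os.path.expanduser('~/.gradle')]

    for path in PATHS:
        if not os.path.isdir(path):
            continue
        st = os.statvfs(path)
        free_gib = st.f_bavail * st.f_frsize / float(1 << 30)
        print('%-20s %6.1f GiB free' % (path, free_gib))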

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 44s
119 actionable tasks: 93 executed, 24 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/hmwq2zjqq6noq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python2 #2423

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/2423/display/redirect?page=changes>

