Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2017/04/20 16:23:06 UTC

Build failed in Jenkins: beam_PostCommit_Python_Verify #1934

See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/1934/display/redirect?page=changes>

Changes:

[tgroh] [BEAM-1948] Defend against absent Aggregators

------------------------------------------
[...truncated 691.83 KB...]
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging appdirs pyparsing
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
DEPRECATION: pip install --download has been deprecated and will be removed in the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/local/lib/python2.7/site-packages/pip/_vendor/requests/packages/urllib3/util/ssl_.py>:318: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/security.html#snimissingwarning.
  SNIMissingWarning
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/local/lib/python2.7/site-packages/pip/_vendor/requests/packages/urllib3/util/ssl_.py>:122: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/security.html#insecureplatformwarning.
  InsecurePlatformWarning
  File was already downloaded /tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/setuptools-35.0.1.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-3.0.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging appdirs pyparsing
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
DEPRECATION: pip install --download has been deprecated and will be removed in the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/setuptools-35.0.1.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-3.0.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
DEPRECATION: pip install --download has been deprecated and will be removed in the future. Pip now has a download command that should be used instead.
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging appdirs pyparsing
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/setuptools-35.0.1.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-3.0.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging appdirs pyparsing
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
DEPRECATION: pip install --download has been deprecated and will be removed in the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/setuptools-35.0.1.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-3.0.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging appdirs pyparsing
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

======================================================================
ERROR: test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/transforms/ptransform_test.py",> line 219, in test_undeclared_outputs
    pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/test_pipeline.py",> line 91, in run
    result = super(TestPipeline, self).run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 160, in run
    self.to_runner_api(), self.runner, self.options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 169, in run
    return self.runner.run(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 35, in run
    self.result = super(TestDataflowRunner, self).run(pipeline)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 247, in run
    self.dataflow_client.create_job(self.job), self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/utils/retry.py",> line 166, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 425, in create_job
    self.create_job_description(job)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 448, in create_job_description
    job.options, file_copy=self._gcs_file_copy)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/internal/dependency.py",> line 306, in stage_job_resources
    setup_options.requirements_file, requirements_cache_path)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/internal/dependency.py",> line 242, in _populate_requirements_cache
    processes.check_call(cmd_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/utils/processes.py",> line 40, in check_call
    return subprocess.check_call(*args, **kwargs)
  File "/usr/lib/python2.7/subprocess.py", line 540, in check_call
    raise CalledProcessError(retcode, cmd)
CalledProcessError: Command '['<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/bin/python',> '-m', 'pip', 'install', '--download', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--no-binary', ':all:']' returned non-zero exit status 2
-------------------- >> begin captured logging << --------------------
root: DEBUG: PValue computed by Some Numbers/Read (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by ClassifyNumbers/FlatMap(<lambda at ptransform_test.py:213>) (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/WindowInto(WindowIntoFn) (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/Create/Read (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/ToVoidKey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/Group/pair_with_0 (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/Group/pair_with_1 (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/Group/Flatten (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/Group/GroupByKey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/Group/Map(_merge_tagged_vals_under_key) (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/Unkey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by ClassifyNumbers/FlatMap(<lambda at ptransform_test.py:213>) (tag odd): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/WindowInto(WindowIntoFn) (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/Create/Read (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/ToVoidKey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/Group/pair_with_0 (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/Group/pair_with_1 (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/Group/Flatten (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/Group/GroupByKey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/Group/Map(_merge_tagged_vals_under_key) (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/Unkey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by ClassifyNumbers/FlatMap(<lambda at ptransform_test.py:213>) (tag even): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/WindowInto(WindowIntoFn) (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/Create/Read (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/ToVoidKey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/Group/pair_with_0 (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/Group/pair_with_1 (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/Group/Flatten (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/Group/GroupByKey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/Group/Map(_merge_tagged_vals_under_key) (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/Unkey (tag None): refcount: 1 => 0
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-validatesrunner-test/beamapp-jenkins-0420160504-202353.1492704304.202687/requirements.txt...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-validatesrunner-test/beamapp-jenkins-0420160504-202353.1492704304.202687/requirements.txt
root: INFO: Executing command: ['<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/bin/python>', '-m', 'pip', 'install', '--download', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--no-binary', ':all:']
--------------------- >> end captured logging << ---------------------
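
[Editor's note] Both errors in this build share one root cause: apache_beam/runners/dataflow/internal/dependency.py populates the requirements cache by shelling out to "pip install --download", the exact invocation that the DEPRECATION notices earlier in this log warn about, so a pip release that dropped the flag makes the subprocess exit with status 2. Below is a minimal sketch of the supported replacement using "pip download"; the helper name and the literal paths mirror the log but are illustrative assumptions, not Beam's actual patch.

    # A minimal sketch (not Beam's actual patch): populate the requirements
    # cache with the supported "pip download" subcommand instead of the
    # removed "pip install --download". Helper name and paths are illustrative.
    import subprocess
    import sys

    def populate_requirements_cache(requirements_file, cache_dir):
        cmd_args = [
            sys.executable, '-m', 'pip', 'download',
            '--dest', cache_dir,        # replaces "pip install --download DIR"
            '-r', requirements_file,
            '--no-binary', ':all:',     # still fetch source distributions only
        ]
        subprocess.check_call(cmd_args)  # raises CalledProcessError on failure

    populate_requirements_cache('postcommit_requirements.txt',
                                '/tmp/dataflow-requirements-cache')

Keeping --no-binary ':all:' preserves the old behavior of caching sdists only, which is what the worker-side install expects.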

======================================================================
ERROR: test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/transforms/ptransform_test.py",> line 190, in test_par_do_with_multiple_outputs_and_using_yield
    pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/test_pipeline.py",> line 91, in run
    result = super(TestPipeline, self).run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 160, in run
    self.to_runner_api(), self.runner, self.options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 169, in run
    return self.runner.run(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 35, in run
    self.result = super(TestDataflowRunner, self).run(pipeline)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 247, in run
    self.dataflow_client.create_job(self.job), self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/utils/retry.py",> line 166, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 425, in create_job
    self.create_job_description(job)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 448, in create_job_description
    job.options, file_copy=self._gcs_file_copy)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/internal/dependency.py",> line 306, in stage_job_resources
    setup_options.requirements_file, requirements_cache_path)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/internal/dependency.py",> line 242, in _populate_requirements_cache
    processes.check_call(cmd_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/utils/processes.py",> line 40, in check_call
    return subprocess.check_call(*args, **kwargs)
  File "/usr/lib/python2.7/subprocess.py", line 540, in check_call
    raise CalledProcessError(retcode, cmd)
CalledProcessError: Command '['<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/bin/python',> '-m', 'pip', 'install', '--download', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--no-binary', ':all:']' returned non-zero exit status 2
-------------------- >> begin captured logging << --------------------
root: DEBUG: PValue computed by Some Numbers/Read (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by ClassifyNumbers/ParDo(SomeDoFn) (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/WindowInto(WindowIntoFn) (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/Create/Read (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/ToVoidKey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/Group/pair_with_0 (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/Group/pair_with_1 (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/Group/Flatten (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/Group/GroupByKey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/Group/Map(_merge_tagged_vals_under_key) (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/Unkey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by ClassifyNumbers/ParDo(SomeDoFn) (tag odd): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/WindowInto(WindowIntoFn) (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/Create/Read (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/ToVoidKey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/Group/pair_with_0 (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/Group/pair_with_1 (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/Group/Flatten (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/Group/GroupByKey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/Group/Map(_merge_tagged_vals_under_key) (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/Unkey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by ClassifyNumbers/ParDo(SomeDoFn) (tag even): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/WindowInto(WindowIntoFn) (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/Create/Read (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/ToVoidKey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/Group/pair_with_0 (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/Group/pair_with_1 (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/Group/Flatten (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/Group/GroupByKey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/Group/Map(_merge_tagged_vals_under_key) (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/Unkey (tag None): refcount: 1 => 0
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-validatesrunner-test/beamapp-jenkins-0420160503-860733.1492704303.861022/requirements.txt...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-validatesrunner-test/beamapp-jenkins-0420160503-860733.1492704303.861022/requirements.txt
root: INFO: Executing command: ['<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/bin/python>', '-m', 'pip', 'install', '--download', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--no-binary', ':all:']
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 14 tests in 1083.361s

FAILED (errors=2)
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_05_21-4921794609676322789?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_11_41-10517555660450162259?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_17_23-7320475611797040546?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_05_27-98731489016263257?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_10_46-5071289244291002069?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_16_41-4406947585920097725?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_05_28-8567292128158858050?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_11_08-9283717048514166273?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_16_34-9587575857372064910?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_05_21-565875446987948821?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_11_30-9133189730930856877?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_16_43-2246282137554446181?project=apache-beam-testing
Build step 'Execute shell' marked build as failure

Jenkins build is back to normal : beam_PostCommit_Python_Verify #1936

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/1936/display/redirect?page=changes>


Build failed in Jenkins: beam_PostCommit_Python_Verify #1935

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/1935/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-662] Fix for allowing floating point periods in windows

[robertwb] Make stage names consistent.

[robertwb] Require deterministic window coders.

[robertwb] Enable IntervalWindowCoder test check.

[robertwb] Remove obsolete and unused Runner.clear

[robertwb] Rename AfterFirst to AfterAny for consistency with Java.

[robertwb] Remove bigshuffle from python examples

[robertwb] Remove vestigial Read and Write from core.py

[aljoscha.krettek] [BEAM-1886] Remove TextIO override in Flink runner

[aljoscha.krettek] Exclude UsesSplittableParDoWithWindowedSideInputs in Flink Stream Runner

------------------------------------------
[...truncated 850.31 KB...]
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s12", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_merge_tagged_vals_under_key"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Group/Map(_merge_tagged_vals_under_key).out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s11"
        }, 
        "serialized_fn": "<string of 1332 bytes>", 
        "user_name": "assert_that/Group/Map(_merge_tagged_vals_under_key)"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s13", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s12"
        }, 
        "serialized_fn": "<string of 956 bytes>", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s14", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s13"
        }, 
        "serialized_fn": "<string of 1112 bytes>", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: DEBUG: Response returned status 429, retrying
root: DEBUG: Retrying request to url https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json after exception HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'status': '429', 'content-length': '441', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Thu, 20 Apr 2017 16:48:52 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 429,
    "message": "(3cd60379f74dbe03): The workflow could not be created. Causes: (66c8265657b52535): Too many running jobs. Project apache-beam-testing is running 25 jobs and project limit for active jobs is 25. To fix this, cancel an existing workflow via the UI, wait for a workflow to finish or contact dataflow-feedback@google.com to request an increase in quota.",
    "status": "RESOURCE_EXHAUSTED"
  }
}
>
root: DEBUG: Response returned status 429, retrying
root: DEBUG: Retrying request to url https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json after exception HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'status': '429', 'content-length': '440', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Thu, 20 Apr 2017 16:48:54 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 429,
    "message": "(3526da3a3bbd396): The workflow could not be created. Causes: (802261ba744c1a25): Too many running jobs. Project apache-beam-testing is running 25 jobs and project limit for active jobs is 25. To fix this, cancel an existing workflow via the UI, wait for a workflow to finish or contact dataflow-feedback@google.com to request an increase in quota.",
    "status": "RESOURCE_EXHAUSTED"
  }
}
>
root: DEBUG: Response returned status 429, retrying
root: DEBUG: Retrying request to url https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json after exception HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'status': '429', 'content-length': '441', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Thu, 20 Apr 2017 16:49:03 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 429,
    "message": "(22c61baeeb4ac6db): The workflow could not be created. Causes: (71bb6e39d741719f): Too many running jobs. Project apache-beam-testing is running 26 jobs and project limit for active jobs is 25. To fix this, cancel an existing workflow via the UI, wait for a workflow to finish or contact dataflow-feedback@google.com to request an increase in quota.",
    "status": "RESOURCE_EXHAUSTED"
  }
}
>
root: DEBUG: Response returned status 429, retrying
root: DEBUG: Retrying request to url https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json after exception HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'status': '429', 'content-length': '441', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Thu, 20 Apr 2017 16:49:13 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 429,
    "message": "(639197c3a7b95033): The workflow could not be created. Causes: (31b3e2093d3627a8): Too many running jobs. Project apache-beam-testing is running 25 jobs and project limit for active jobs is 25. To fix this, cancel an existing workflow via the UI, wait for a workflow to finish or contact dataflow-feedback@google.com to request an increase in quota.",
    "status": "RESOURCE_EXHAUSTED"
  }
}
>
root: ERROR: HTTP status 429 trying to create job at dataflow service endpoint https://dataflow.googleapis.com
root: CRITICAL: details of server error: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'status': '429', 'content-length': '441', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Thu, 20 Apr 2017 16:49:32 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 429,
    "message": "(e02f0d7f7d8f8e8e): The workflow could not be created. Causes: (5fb011768af83075): Too many running jobs. Project apache-beam-testing is running 25 jobs and project limit for active jobs is 25. To fix this, cancel an existing workflow via the UI, wait for a workflow to finish or contact dataflow-feedback@google.com to request an increase in quota.",
    "status": "RESOURCE_EXHAUSTED"
  }
}
>
--------------------- >> end captured logging << ---------------------
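
[Editor's note] The failure mode in build #1935 is different: the Dataflow service rejects job creation with HTTP 429 RESOURCE_EXHAUSTED because apache-beam-testing is already at its limit of 25 concurrent jobs. Beam's apache_beam/utils/retry.py handles this by retrying, as the DEBUG lines above show. A minimal standalone sketch of the same retry-with-exponential-backoff idea follows; HttpError here is a stand-in class for illustration, not the actual apitools exception.

    # A minimal standalone sketch of retry-with-exponential-backoff for
    # HTTP 429 responses, in the spirit of apache_beam/utils/retry.py.
    # HttpError below is a stand-in, not the actual apitools exception.
    import random
    import time

    class HttpError(Exception):
        def __init__(self, status_code, message=''):
            super(HttpError, self).__init__(message)
            self.status_code = status_code

    def call_with_backoff(fn, max_attempts=5, base_delay=2.0):
        for attempt in range(max_attempts):
            try:
                return fn()
            except HttpError as e:
                if e.status_code != 429 or attempt == max_attempts - 1:
                    raise  # not a quota error, or retries exhausted
                # Sleep 2s, 4s, 8s, ... plus jitter before trying again.
                time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 1))

The jitter matters here: it spreads simultaneous retries from parallel test pipelines apart so they do not re-hit the quota check in lockstep.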

----------------------------------------------------------------------
Ran 14 tests in 1143.787s

FAILED (errors=2)
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_36_03-17306565410535521102?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_42_25-11836023461597041757?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_48_50-13690095825366945233?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_36_03-12399630952851908062?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_42_45-4692674566775294509?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_36_03-11692027761086874489?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_42_20-4137566846231221883?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_47_02-15845890897779327383?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_36_03-8589621051685956297?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_41_46-12806371989050635570?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-04-20_09_46_58-9673877740759467903?project=apache-beam-testing
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user aljoscha.krettek@gmail.com