Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2017/05/03 17:59:57 UTC

Build failed in Jenkins: beam_PostCommit_Python_Verify #2099

See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/2099/display/redirect?page=changes>

Changes:

[klk] Use LinkedHashMap for step contexts in BaseExecutionContext

------------------------------------------
[...truncated 508.52 KB...]
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-3.0.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging appdirs pyparsing
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/coders/typecoders.py>:132: UserWarning: Using fallback coder for typehint: List[Any].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/coders/typecoders.py>:132: UserWarning: Using fallback coder for typehint: Union[].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
DEPRECATION: pip install --download has been deprecated and will be removed in the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
  File was already downloaded /tmp/dataflow-requirements-cache/setuptools-35.0.2.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
DEPRECATION: pip install --download has been deprecated and will be removed in the future. Pip now has a download command that should be used instead.
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-3.0.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
DEPRECATION: pip install --download has been deprecated and will be removed in the future. Pip now has a download command that should be used instead.
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
  File was already downloaded /tmp/dataflow-requirements-cache/setuptools-35.0.2.zip
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging appdirs pyparsing
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
  File was already downloaded /tmp/dataflow-requirements-cache/setuptools-35.0.2.zip
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-3.0.0.tar.gz
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-3.0.0.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging appdirs pyparsing
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging appdirs pyparsing
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
DEPRECATION: pip install --download has been deprecated and will be removed in the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/setuptools-35.0.2.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
DEPRECATION: pip install --download has been deprecated and will be removed in the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-3.0.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
  File was already downloaded /tmp/dataflow-requirements-cache/setuptools-35.0.2.zip
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging appdirs pyparsing
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-3.0.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging appdirs pyparsing
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
DEPRECATION: pip install --download has been deprecated and will be removed in the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/setuptools-35.0.2.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-3.0.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging appdirs pyparsing
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
Ran 15 tests in 1533.764s

OK
Found: https://console.cloud.google.com/dataflow/job/2017-05-03_10_19_40-1754690485454628612?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-03_10_26_09-6280341290467304383?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-03_10_32_19-9139535620831442120?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-03_10_38_24-2007546638189180577?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-03_10_19_39-18414103733113071672?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-03_10_26_14-15995336248747324315?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-03_10_32_53-732622911349438770?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-03_10_38_39-16520079622423585638?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-03_10_19_44-1110405065218167518?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-03_10_25_51-2827855989703016182?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-03_10_32_51-9134080085331529102?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-03_10_38_26-3394490629282706824?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-03_10_19_38-11882203582383710879?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-03_10_25_58-12204816031747231236?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-03_10_32_53-12742018046237488743?project=apache-beam-testing

# Run integration tests on the Google Cloud Dataflow service
# and validate that jobs finish successfully.
echo ">>> RUNNING TEST DATAFLOW RUNNER it tests"
>>> RUNNING TEST DATAFLOW RUNNER it tests
python setup.py nosetests \
  --attr IT \
  --nocapture \
  --processes=4 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --staging_location=$GCS_LOCATION/staging-it \
    --temp_location=$GCS_LOCATION/temp-it \
    --output=$GCS_LOCATION/py-it-cloud/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1 \
    --sleep_secs=20"
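
(Not part of the log.) The --attr IT and --test-pipeline-options plumbing in the command above is consumed inside the tests themselves. Below is a minimal sketch of how a Beam Python integration test typically picks those options up; the module path apache_beam.testing.test_pipeline reflects later SDK layouts (older SDKs exposed apache_beam.test_pipeline), and the output bucket and run() entry point are placeholders, not names from this build.

    import unittest

    from nose.plugins.attrib import attr

    from apache_beam.testing.test_pipeline import TestPipeline


    class ExamplePipelineIT(unittest.TestCase):

      @attr('IT')  # selected by "nosetests --attr IT" above
      def test_example_pipeline_it(self):
        # TestPipeline parses the --test-pipeline-options string passed on the
        # command line (runner, project, staging/temp locations, sdk_location).
        test_pipeline = TestPipeline(is_integration_test=True)

        # Per-test extras are merged in and rendered back into argv form, as
        # bigquery_tornadoes_it_test.py does in the traceback further down.
        extra_opts = {'output': 'gs://YOUR_BUCKET/py-it-cloud/output'}  # placeholder
        argv = test_pipeline.get_full_options_as_args(**extra_opts)

        # example_module.run(argv)  # hypothetical entry point; the real test
        #                           # calls bigquery_tornadoes.run(...), which
        #                           # blocks on p.run().wait_until_finish()

The point is that the test hard-codes neither runner nor project; everything comes from the --test-pipeline-options string assembled in the shell snippet above.
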
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/local/lib/python2.7/site-packages/setuptools/dist.py>:334: UserWarning: Normalizing '0.7.0.dev' to '0.7.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/coders/typecoders.py>:132: UserWarning: Using fallback coder for typehint: Any.
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/coders/typecoders.py>:132: UserWarning: Using fallback coder for typehint: Dict[Any, Any].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/coders/typecoders.py>:132: UserWarning: Using fallback coder for typehint: Any.
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/io/gcp/gcsio.py>:110: DeprecationWarning: object() takes no parameters
  super(GcsIO, cls).__new__(cls, storage_client))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/dataflow_metrics.py>:109: UserWarning: Distribution metrics will be ignored in the MetricsResult.querymethod. You can see them in the Dataflow User Interface.
  warn('Distribution metrics will be ignored in the MetricsResult.query'
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
Found: https://console.cloud.google.com/dataflow/job/2017-05-03_10_44_57-16736681653515203099?project=apache-beam-testing
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ERROR

======================================================================
ERROR: test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/nose-1.3.7-py2.7.egg/nose/plugins/multiprocess.py>", line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/nose-1.3.7-py2.7.egg/nose/case.py>", line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/nose-1.3.7-py2.7.egg/nose/case.py>", line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/nose-1.3.7-py2.7.egg/nose/case.py>", line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 395, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 331, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py>", line 61, in test_bigquery_tornadoes_it
    test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/examples/cookbook/bigquery_tornadoes.py>", line 94, in run
    p.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py>", line 173, in run
    return self.runner.run(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 44, in run
    self.result.wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 769, in wait_until_finish
    time.sleep(5.0)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/nose-1.3.7-py2.7.egg/nose/plugins/multiprocess.py>", line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT)'

----------------------------------------------------------------------
Ran 2 tests in 900.841s

FAILED (errors=1)
Found: https://console.cloud.google.com/dataflow/job/2017-05-03_10_44_57-12879509101884102764?project=apache-beam-testing
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user klk@google.com
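
(Not part of the log.) A note on the failure above: the traceback ends in nose's multiprocess plugin raising TimedOutException from its signal handler, i.e. the --process-timeout=900 limit in the invocation above expired while test_bigquery_tornadoes_it was still inside wait_until_finish(), interrupting the runner's time.sleep(5.0) polling loop. The summary "Ran 2 tests in 900.841s" matches that 900-second cap, so this reads as the Dataflow job not finishing within the allotted time rather than the pipeline erroring out. Common mitigations are raising --process-timeout in the shell invocation, or bounding the wait inside the test so a slow job fails with a clear assertion. A minimal sketch of the latter follows; wait_until_finish(duration=...) takes milliseconds and is available in later SDKs (whether the 0.7.0.dev Dataflow runner honors it is not shown in this log), and the 10-minute figure is only an example.

    from apache_beam.runners.runner import PipelineState

    result = pipeline.run()  # "pipeline" stands in for the pipeline under test
    result.wait_until_finish(duration=10 * 60 * 1000)  # stop waiting after 10 min
    assert result.state == PipelineState.DONE, (
        'Job did not reach DONE before the test timeout: %s' % result.state)

Build #2100 (below) subsequently went back to normal.
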

Jenkins build is back to normal : beam_PostCommit_Python_Verify #2100

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/2100/display/redirect?page=changes>