Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/09/29 18:57:27 UTC

Build failed in Jenkins: beam_PerformanceTests_WordCountIT_Py35 #545

See <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/545/display/redirect>

Changes:


------------------------------------------
[...truncated 157.21 KB...]
        ],
        "non_parallel_inputs": {
          "side0-write/Write/WriteImpl/FinalizeWrite": {
            "@type": "OutputReference",
            "output_name": "out",
            "step_name": "SideInput-s18"
          },
          "side1-write/Write/WriteImpl/FinalizeWrite": {
            "@type": "OutputReference",
            "output_name": "out",
            "step_name": "SideInput-s19"
          },
          "side2-write/Write/WriteImpl/FinalizeWrite": {
            "@type": "OutputReference",
            "output_name": "out",
            "step_name": "SideInput-s20"
          }
        },
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "write/Write/WriteImpl/FinalizeWrite.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s7"
        },
        "serialized_fn": "eNrNV/l/G8UVX0nOwRIgJCUlhIJIcbumSBuHJhBDoaAkxFGjuGuDt4V0O9odaTbe3dk3M2vFNOJIasfQE3pB79L7Lr3v4+/oP9M3I9lGYNP8Bp+PtKM53ps33/d931k9W3FCkpOQ0aBNSVpXgmSyw0Uq6yEX1G6QJCHthM4LkudUHOcnMxusieeg1Iey4++yLCvoFLhGdGVOQ6i0/Gv1mFrKacDiTEkYG90AJ8x4PaK4A1FcSHv67BwOn9LDNmxD79tbfdgxcM8LlRfKOJSwc+A+zjaGrmkVK2C3/YoJJYNrR7aLOX7aRFI76MQZSeKnaNATsaI27PJ3oEkueEilhOvYrqJ9Dq533miu6AUVczuYw3Y2zhZsuAED3N2HGx1/HzqQjIgoyEhKg27C24EGjyjYYyZrd9V4pzbuZEUamIVy4tCRCPb6N45aDo3e5d+pjcad4VSR6uWburjJ341LO3FCBx5k0enEF2DfSPg8x9AzWV8kSUEDPOpiHFFhzyqi4vBxPTizNgbvxlPd3If9jm+j44GFRhhuCaM4SeqBftpBwklkxm04YDCXSsCty/AeD27zt61Zwu0+/rSgWrTNYMhxD7hjJDgzJoeNPcdnlYizboObaA5iNO/tw51tE83G2WG8afn7dao1HzMMRZA4Qbsgoz1skRDvWzXQhDzNBWYWARic4/3+GA6TQnFwTOoDRokOa6K1gWVOFEOgqMbyLqYx+UAf7nbYLew2drt/Fpd15ZTrKprmNYncJV1aw+TVMJSa4oOGSiXdfKkWq1qY8CJyBxR2J48cPXbvfYeP4efwURdjKxIsjxpCdA36TeN0CHjdnFnzzs0TEmfgFu1RVmaKCiRzvVBxYj8sukVKMzWTkJAynhj8DmHkk0w/DzdLpr2H9uGD5+CI40/orGHSD9VMJbjzG8/pNE/ck8NCMSNwdGTr3KTXDh7L8jhcSGg0i46mdTnacG8f7nMMs3txFvFekGKOdGqwKo9tpTI6EFPO0jb1Q5LgTdY2TJncRUQRuH+zeNajOI5LbHgAz/uhPjw4yPNiTHs6hodGiBwKShRWXpGFukhs+LDDDvg7cb2moNYjeHgZHvGg4TRLTQu/lebexnWrlnXRslZL1uWyNQvHW8twYsJYIVZCKyWcXIZHfYEjLuMpdc/TbCHO5FpbkwlZpG6PiwWJ56CuPkYwQ4WRgCykc5o+wTwXUYMXmZqeC2aW7jniShG6MlrQxFKMZ+7rQHAHINTzJThlQnkgIWk7Ig/C9JmzpYYFp/2bNL8FTwOBLjXR1qNtGpE1EA3VAj6yAmcmFLQ8ODuCWJeqgCiF7Jox27SLOFF4JPiogRmn9Sx4KzDrwdyIaZzmXKgg5VGRoHI85u/RpfimPMLjyzDvgW/cB2gbqiCAj63Axz14gk23NktgSLEDTzJMH2aqjJmqNHc0G41LCmuoZJ0v63xFJl+XS1Yfu2VLTloXcapiRWVLjVmXLZNNtU3P4aKoYi2ULdHU3WjM2j+rtlvRNmt9Xu1Ym9noVEynU7H2YXMJfVhIj3NOq1k24ES0Q7Dc4ROaS/7TODLT4ElCDfeqvFOVWHDV8ajaixWrpnj1VhUjOJPRKk2oru8qCfVNRaMqkVWCBlk3oQqtdfLq1ZOxkKqqenxtvazSLNQUokLboMeD4/Lg3eZZh0DBJx1TVUksFRCjQZoVivNEQtvfrvsyiUMKoRF6zC9E/vX460Saq6X1kgNqphOaQccol7lYTgjBBXTZrQqYXza+ITZQrNHsvNldazAsXIGEnTI3ZrA5TdMzr5Uau63yzaXtpb2lPaWdpUqpUoZsAonKPcjZEyxt4UsDtBQID+QyKA8K9mQfFrdQpB6bwlUXPFhagaf68Ck0vOhBnx1jU+aQA7T1haDFFp72D2jKI1entGQFRrOm1uKbWpyEZ4p2cQWeXZfXyauS1+cYyuYlR2/L7mdauC734dMOe2hI6IH0lJqzjbGLmlLLqDgrEwxF5srbJTKroyLz/Jn/lthpnYgXPPgMJuJ5nYjPIp6f8+Dz5mD/H7wvGPC+uA7e4asC70UN3kuvB+9LffjyG8HbRLe/gih+1aD4tbcLxZdHUXwFpZqdZk2G6vt1xPIbHnwTsXylxd6Z0vYtLW3sHSNn31bwHYcR1mYhixhlHWak57ssZucZysv32MtvJS+vbi4v39es/oEHP8RMvKpZ/SNk9Y89+Mky/NSDn2l5+fkW8vILIy+/9OBXK/DrPvwGDV/z4LdXWQ6/M+Xwe/qW/83mzaa4nw1/QOr/sQ9/cszFqs8mFUnzAF+C2/hSLODPzZLh2yBQjO8vW7kerLAfNe9igy3wT9lfcYO/mTsC39K7XayBDP6+lYvhEvv44NKbG3bhH+jknwYvzH2RFgnRdNFvBRT+hQGaf46xDNYuy3+vFm0F/6n/DxnV080=",
        "user_name": "write/Write/WriteImpl/FinalizeWrite/FinalizeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
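
The step names in the job graph above and in the fusing messages that follow (read/Read, split, pair_with_one, group, count, format, write/Write/WriteImpl/...) are the ones produced by the example wordcount pipeline. For orientation, a minimal sketch of a pipeline with that shape, using placeholder local paths rather than this test's GCS locations:

# A minimal sketch of the wordcount pipeline shape behind these step names;
# the input/output paths are placeholders, not the ones used by this test.
import re

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

with beam.Pipeline(options=PipelineOptions()) as p:
    _ = (
        p
        | 'read' >> beam.io.ReadFromText('input.txt')          # placeholder input
        | 'split' >> beam.FlatMap(lambda line: re.findall(r"[\w']+", line))
        | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
        | 'group' >> beam.GroupByKey()
        | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
        | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
        | 'write' >> beam.io.WriteToText('counts'))             # placeholder output prefix

The explicit GroupByKey named 'group' is consistent with the "Combiner lifting skipped for step group: GroupByKey not followed by a combiner" message further down: there is no CombinePerKey for the service to lift.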
root: INFO: Create job: <Job
 createTime: '2019-09-29T18:49:47.991947Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-09-29_11_49_46-11474051589639023489'
 location: 'us-central1'
 name: 'beamapp-jenkins-0929184944-176827'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-09-29T18:49:47.991947Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-09-29_11_49_46-11474051589639023489]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-29_11_49_46-11474051589639023489?project=apache-beam-testing
root: INFO: Job 2019-09-29_11_49_46-11474051589639023489 is in state JOB_STATE_RUNNING
root: INFO: 2019-09-29T18:49:50.175Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-09-29T18:49:50.586Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
root: INFO: 2019-09-29T18:49:51.145Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-09-29T18:49:51.202Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-09-29T18:49:51.230Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
root: INFO: 2019-09-29T18:49:51.270Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-09-29T18:49:51.298Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-09-29T18:49:51.403Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-09-29T18:49:51.441Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-09-29T18:49:51.473Z: JOB_MESSAGE_DETAILED: Fusing consumer split into read/Read
root: INFO: 2019-09-29T18:49:51.502Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
root: INFO: 2019-09-29T18:49:51.531Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Reify into pair_with_one
root: INFO: 2019-09-29T18:49:51.560Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Write into group/Reify
root: INFO: 2019-09-29T18:49:51.597Z: JOB_MESSAGE_DETAILED: Fusing consumer group/GroupByWindow into group/Read
root: INFO: 2019-09-29T18:49:51.634Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/GroupByWindow
root: INFO: 2019-09-29T18:49:51.672Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
root: INFO: 2019-09-29T18:49:51.699Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WriteBundles/WriteBundles into format
root: INFO: 2019-09-29T18:49:51.731Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles/WriteBundles
root: INFO: 2019-09-29T18:49:51.767Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WindowInto(WindowIntoFn) into write/Write/WriteImpl/Pair
root: INFO: 2019-09-29T18:49:51.794Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Reify into write/Write/WriteImpl/WindowInto(WindowIntoFn)
root: INFO: 2019-09-29T18:49:51.820Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Write into write/Write/WriteImpl/GroupByKey/Reify
root: INFO: 2019-09-29T18:49:51.849Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/GroupByWindow into write/Write/WriteImpl/GroupByKey/Read
root: INFO: 2019-09-29T18:49:51.883Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Extract into write/Write/WriteImpl/GroupByKey/GroupByWindow
root: INFO: 2019-09-29T18:49:51.913Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/InitializeWrite into write/Write/WriteImpl/DoOnce/Read
root: INFO: 2019-09-29T18:49:51.951Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-09-29T18:49:51.972Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-09-29T18:49:52.004Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-09-29T18:49:52.044Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-09-29T18:49:52.201Z: JOB_MESSAGE_DEBUG: Executing wait step start26
root: INFO: 2019-09-29T18:49:52.278Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
root: INFO: 2019-09-29T18:49:52.313Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Create
root: INFO: 2019-09-29T18:49:52.314Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-09-29T18:49:52.343Z: JOB_MESSAGE_BASIC: Starting 10 workers in us-central1-f...
root: INFO: 2019-09-29T18:49:52.343Z: JOB_MESSAGE_BASIC: Executing operation group/Create
root: INFO: 2019-09-29T18:49:52.400Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/GroupByKey/Create
root: INFO: 2019-09-29T18:49:52.400Z: JOB_MESSAGE_BASIC: Finished operation group/Create
root: INFO: 2019-09-29T18:49:52.452Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/GroupByKey/Session" materialized.
root: INFO: 2019-09-29T18:49:52.475Z: JOB_MESSAGE_DEBUG: Value "group/Session" materialized.
root: INFO: 2019-09-29T18:49:52.530Z: JOB_MESSAGE_BASIC: Executing operation read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-09-29T18:50:21.172Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 6 based on the rate of progress in the currently running step(s).
root: INFO: 2019-09-29T18:50:21.200Z: JOB_MESSAGE_DETAILED: Resized worker pool to 6, though goal was 10.  This could be a quota issue.
root: INFO: 2019-09-29T18:50:26.618Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 10 based on the rate of progress in the currently running step(s).
root: INFO: 2019-09-29T18:50:48.319Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-09-29T18:50:48.349Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-09-29T18:54:52.768Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-29T18:54:54.843Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-29T18:54:57.899Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-29T18:54:59.969Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-29T18:55:02.028Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-29T18:55:02.053Z: JOB_MESSAGE_BASIC: Finished operation read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-09-29T18:55:02.100Z: JOB_MESSAGE_DEBUG: Executing failure step failure25
root: INFO: 2019-09-29T18:55:02.121Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S06:read/Read+split+pair_with_one+group/Reify+group/Write failed., Internal Issue (2bf87ee7a8c74694): 63963027:24514
root: INFO: 2019-09-29T18:55:02.526Z: JOB_MESSAGE_WARNING: S01:write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite failed.
root: INFO: 2019-09-29T18:55:02.550Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
root: INFO: 2019-09-29T18:55:02.638Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-09-29T18:55:02.685Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-09-29T18:55:02.716Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-09-29T18:57:10.540Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-09-29T18:57:10.581Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-09-29T18:57:10.613Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-09-29_11_49_46-11474051589639023489 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1569782982926/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1569782982926/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1569782982926\\/results[^/\\\\]*'
root: INFO: Starting the size estimation of the input
root: INFO: Finished listing 0 files in 0.048928022384643555 seconds.
--------------------- >> end captured logging << ---------------------
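
The five identical JOB_MESSAGE_ERROR tracebacks above all fail in dill.load_session while restoring the staged main session; a _create_function() call receiving 7 positional arguments where at most 6 are accepted typically indicates that the session was pickled by a newer dill release than the one installed on the worker. A minimal sketch of the save/load round trip that --save_main_session depends on (the file path is a placeholder, and this is not the worker code itself):

# A minimal sketch (not the Dataflow worker code) of the round trip that
# --save_main_session relies on: the launcher pickles the state of __main__
# with dill, stages it, and each worker restores it before running bundles.
# This succeeds when both sides run the same dill release; if the worker's
# dill is older than the launcher's, function objects in the session carry
# an extra _create_function argument and unpickling fails with the
# TypeError seen in the log above.
import dill

SESSION_FILE = '/tmp/main_session.pickle'  # placeholder path for illustration


def extract_words(line):
    return line.split()


# Launcher side, roughly what apache_beam.internal.pickler.dump_session does.
dill.dump_session(SESSION_FILE)

# Worker side, roughly what pickler.load_session -> dill.load_session does.
dill.load_session(SESSION_FILE)
print(extract_words('hello beam world'))

Keeping the submission environment and the worker container on the same dill version is the usual way to avoid this mismatch.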

----------------------------------------------------------------------
XML: nosetests-integrationTest-perf.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 460.978s

FAILED (errors=1)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:integrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 8m 41s

2019-09-29 18:57:25,969 dca6f07b MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 841, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 687, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 91, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
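For context, the AssertionError above is PerfKitBenchmarker converting the non-zero exit status of the Gradle integrationTest task (reported earlier as "Process 'command 'sh'' finished with non-zero exit value 1") into a benchmark failure. A minimal sketch of that pattern, not the PerfKitBenchmarker source, with a hypothetical command list:

# A minimal sketch of the pattern the traceback above shows: run the Gradle
# task as a subprocess, capture the exit code, and turn any non-zero value
# into the AssertionError logged here.
import subprocess

# Hypothetical command for illustration; the real invocation is the
# ':sdks:python:test-suites:dataflow:py35:integrationTest' task shown
# elsewhere in this log.
cmd = ["./gradlew", ":sdks:python:test-suites:dataflow:py35:integrationTest"]

proc = subprocess.run(cmd)
retcode = proc.returncode

# Mirrors gcp_dpb_dataflow.py line 91 from the traceback.
assert retcode == 0, "Integration Test Failed."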
2019-09-29 18:57:25,971 dca6f07b MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-09-29 18:57:25,976 dca6f07b MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 841, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 687, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 91, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-09-29 18:57:25,976 dca6f07b MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-09-29 18:57:25,978 dca6f07b MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-09-29 18:57:25,978 dca6f07b MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/runs/dca6f07b/pkb.log>
2019-09-29 18:57:25,978 dca6f07b MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/runs/dca6f07b/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Jenkins build is back to normal : beam_PerformanceTests_WordCountIT_Py35 #555

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/555/display/redirect>




Build failed in Jenkins: beam_PerformanceTests_WordCountIT_Py35 #554

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/554/display/redirect?page=changes>

Changes:

[robertwb] Pull out reusable parts of BeamFileSystemArtifact*.java into abstract

[valentyn] Move Flink VR test code into a common file to be shared across multiple

[valentyn] Include common portable runner tasks in python 3.x suites.

[valentyn] Evaluate the name of Python container task based on interpreter version.

[valentyn] Add Python 3.5 ValidatesRunner Flink Jenkins suite.

[valentyn] Match test suite name in the UI with the trigger command.

[valentyn] Update postcommit status links in README.md.

[valentyn] Rename existing Python_PVR_Flink precommit job to Python2_PVR_Flink.

[robertwb] [BEAM-8312] ArtifactRetrievalService serving artifacts from jar

[github] Fix Python RC staging URL in release script


------------------------------------------
[...truncated 181.39 KB...]

> Task :sdks:python:test-suites:dataflow:py35:installGcpTest
Caching disabled for task ':sdks:python:test-suites:dataflow:py35:installGcpTest': Caching has not been enabled for the task
Task ':sdks:python:test-suites:dataflow:py35:installGcpTest' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Custom actions are attached to task ':sdks:python:test-suites:dataflow:py35:installGcpTest'.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/test-suites/dataflow/py35> Command: sh -c . <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/bin/activate> && pip install --retries 10 -e <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/[gcp,test]>
Successfully started process 'command 'sh''
Obtaining file://<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python>
Collecting crcmod<2.0,>=1.7 (from apache-beam==2.17.0.dev0)
Collecting dill<0.3.1,>=0.3.0 (from apache-beam==2.17.0.dev0)
Collecting fastavro<0.22,>=0.21.4 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ac/7d/e63a1ba78326e42a69bda88b1fcfca22ddd773c4cc51ae85b3b869abcff2/fastavro-0.21.24-cp35-cp35m-manylinux1_x86_64.whl
Collecting future<1.0.0,>=0.16.0 (from apache-beam==2.17.0.dev0)
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from apache-beam==2.17.0.dev0) (1.24.0)
Collecting hdfs<3.0.0,>=2.1.0 (from apache-beam==2.17.0.dev0)
Collecting httplib2<=0.12.0,>=0.8 (from apache-beam==2.17.0.dev0)
Collecting mock<3.0.0,>=1.0.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e6/35/f187bdf23be87092bd0f1200d43d23076cee4d0dec109f195173fd3ebc79/mock-2.0.0-py2.py3-none-any.whl
Collecting pymongo<4.0.0,>=3.8.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/fe/96/3f43c48b2801e5cefe893421d67640cdc2b7cd940a51790b5c2062fb044e/pymongo-3.9.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting oauth2client<4,>=2.0.1 (from apache-beam==2.17.0.dev0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from apache-beam==2.17.0.dev0) (3.9.2)
Collecting pydot<2,>=1.2.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/33/d1/b1479a770f66d962f545c2101630ce1d5592d90cb4f083d38862e93d16d2/pydot-1.4.1-py2.py3-none-any.whl
Collecting python-dateutil<3,>=2.8.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/41/17/c62faccbfbd163c7f57f3844689e3a78bae1f403648a6afb1d0866d87fbb/python_dateutil-2.8.0-py2.py3-none-any.whl
Collecting pytz>=2018.3 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/87/76/46d697698a143e05f77bec5a526bf4e56a0be61d63425b68f4ba553b51f2/pytz-2019.2-py2.py3-none-any.whl
Collecting avro-python3<2.0.0,>=1.8.1 (from apache-beam==2.17.0.dev0)
Collecting pyarrow<0.15.0,>=0.11.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/54/95/bcbe5658d6ac65af35996a80ed66d82c50f9c0b36424f4758cd54dd08d73/pyarrow-0.14.1-cp35-cp35m-manylinux2010_x86_64.whl
Collecting cachetools<4,>=3.1.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/2f/a6/30b0a0bef12283e83e58c1d6e7b5aabc7acfc4110df81a4471655d33e704/cachetools-3.1.1-py2.py3-none-any.whl
Collecting google-apitools<0.5.29,>=0.5.28 (from apache-beam==2.17.0.dev0)
Collecting google-cloud-datastore<1.8.0,>=1.7.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/d0/aa/29cbcf8cf7d08ce2d55b9dce858f7c632b434cb6451bed17cb4275804217/google_cloud_datastore-1.7.4-py2.py3-none-any.whl
Collecting google-cloud-pubsub<1.1.0,>=0.39.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/d3/91/07a82945a7396ea34debafd476724bb5fc267c292790fdf2138c693f95c5/google_cloud_pubsub-1.0.2-py2.py3-none-any.whl
Collecting google-cloud-bigquery<1.18.0,>=1.6.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/a4/96/1b9cf1d43869c47a205aad411dac7c3040df6093d63c39273fa4d4c45da7/google_cloud_bigquery-1.17.1-py2.py3-none-any.whl
Collecting google-cloud-core<2,>=0.28.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ee/f0/084f598629db8e6ec3627688723875cdb03637acb6d86999bb105a71df64/google_cloud_core-1.0.3-py2.py3-none-any.whl
Collecting google-cloud-bigtable<1.1.0,>=0.31.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/95/af/0ef7d097a1d5ad0c843867600e86de915e8ab8864740f49a4636cfb51af6/google_cloud_bigtable-1.0.0-py2.py3-none-any.whl
Collecting nose>=1.3.7 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/15/d8/dd071918c040f50fa1cf80da16423af51ff8ce4a0f2399b7bf8de45ac3d9/nose-1.3.7-py3-none-any.whl
Collecting nose_xunitmp>=0.4.1 (from apache-beam==2.17.0.dev0)
Collecting numpy<2,>=1.14.3 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/9b/21/2b18339d24a2f73dcefb2f10f48aff6182e16da83e3a612684443c6cfb29/numpy-1.17.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting pandas<0.25,>=0.23.4 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/74/24/0cdbf8907e1e3bc5a8da03345c23cbed7044330bb8f73bb12e711a640a00/pandas-0.24.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting parameterized<0.7.0,>=0.6.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/3a/49/75f6dadb09e2f8ace3cdffe0c99a04f1b98dff41fbf9e768665d8b469e29/parameterized-0.6.3-py2.py3-none-any.whl
Collecting pyhamcrest<2.0,>=1.9 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/9a/d5/d37fd731b7d0e91afcc84577edeccf4638b4f9b82f5ffe2f8b62e2ddc609/PyHamcrest-1.9.0-py2.py3-none-any.whl
Collecting pyyaml<6.0.0,>=3.12 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e3/e8/b3212641ee2718d556df0f23f78de8303f068fe29cdaa7a91018849582fe/PyYAML-5.1.2.tar.gz
Collecting tenacity<6.0,>=5.0.2 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/1e/a1/be8c8610f4620c56790965ba2b564dd76d13cbcd7c2ff8f6053ce63027fb/tenacity-5.1.1-py2.py3-none-any.whl
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.17.0.dev0) (1.12.0)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
Collecting requests>=2.7.0 (from hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting pbr>=0.11 (from mock<3.0.0,>=1.0.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/46/a4/d5c83831a3452713e4b4f126149bc4fbda170f7cb16a86a00ce57ce0e9ad/pbr-5.4.3-py2.py3-none-any.whl
Collecting pyasn1>=0.1.7 (from oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/a1/71/8f0d444e3a74e5640a3d5d967c1c6b015da9c655f35b2d308a55d907a517/pyasn1-0.4.7-py2.py3-none-any.whl
Collecting pyasn1-modules>=0.0.5 (from oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/be/70/e5ea8afd6d08a4b99ebfc77bd1845248d56cfcf43d11f9dc324b9580a35c/pyasn1_modules-0.2.6-py2.py3-none-any.whl
Collecting rsa>=3.1.4 (from oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/02/e5/38518af393f7c214357079ce67a317307936896e961e35450b70fad2a9cf/rsa-4.0-py2.py3-none-any.whl
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.17.0.dev0) (41.2.0)
Collecting pyparsing>=2.1.4 (from pydot<2,>=1.2.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/11/fa/0160cd525c62d7abd076a070ff02b2b94de589f1a9789774f17d7c54058e/pyparsing-2.4.2-py2.py3-none-any.whl
Collecting fasteners>=0.14 (from google-apitools<0.5.29,>=0.5.28->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/18/bd/55eb2d6397b9c0e263af9d091ebdb756b15756029b3cededf6461481bc63/fasteners-0.15-py2.py3-none-any.whl
Collecting google-api-core[grpc]<2.0.0dev,>=1.6.0 (from google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/71/e5/7059475b3013a3c75abe35015c5761735ab224eb1b129fee7c8e376e7805/google_api_core-1.14.2-py2.py3-none-any.whl
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3 (from google-cloud-pubsub<1.1.0,>=0.39.0->apache-beam==2.17.0.dev0)
Collecting google-resumable-media<0.5.0dev,>=0.3.1 (from google-cloud-bigquery<1.18.0,>=1.6.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/96/d7/b29a41b01b854480891dfc408211ffb0cc7a2a3d5f15a3b6740ec18c845b/google_resumable_media-0.4.1-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/18/b0/8146a4f8dd402f60744fa380bc73ca47303cccf8b9190fd16a827281eac2/certifi-2019.9.11-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e0/da/55f51ea951e1b7c63a579c09dd7db825bb730ec1fe9c0180fc77bfb31448/urllib3-1.25.6-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting monotonic>=0.1 (from fasteners>=0.14->google-apitools<0.5.29,>=0.5.28->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ac/aa/063eca6a416f397bd99552c534c6d11d57f58f2e94c14780f3bbf818c4cf/monotonic-1.5-py2.py3-none-any.whl
Collecting googleapis-common-protos<2.0dev,>=1.6.0 (from google-api-core[grpc]<2.0.0dev,>=1.6.0->google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.17.0.dev0)
Collecting google-auth<2.0dev,>=0.4.0 (from google-api-core[grpc]<2.0.0dev,>=1.6.0->google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/c5/9b/ed0516cc1f7609fb0217e3057ff4f0f9f3e3ce79a369c6af4a6c5ca25664/google_auth-1.6.3-py2.py3-none-any.whl
Building wheels for collected packages: pyyaml
  Building wheel for pyyaml (setup.py): started
  Building wheel for pyyaml (setup.py): finished with status 'done'
  Created wheel for pyyaml: filename=PyYAML-5.1.2-cp35-cp35m-linux_x86_64.whl size=44103 sha256=24a6e9d193a020a87625a21f9f9f3c031d145fc0c25256da68b7f5322ecb3b6c
  Stored in directory: /home/jenkins/.cache/pip/wheels/d9/45/dd/65f0b38450c47cf7e5312883deb97d065e030c5cca0a365030
Successfully built pyyaml
Installing collected packages: crcmod, dill, fastavro, future, docopt, certifi, urllib3, idna, chardet, requests, hdfs, httplib2, pbr, mock, pymongo, pyasn1, pyasn1-modules, rsa, oauth2client, pyparsing, pydot, python-dateutil, pytz, avro-python3, numpy, pyarrow, cachetools, monotonic, fasteners, google-apitools, googleapis-common-protos, google-auth, google-api-core, google-cloud-core, google-cloud-datastore, grpc-google-iam-v1, google-cloud-pubsub, google-resumable-media, google-cloud-bigquery, google-cloud-bigtable, nose, nose-xunitmp, pandas, parameterized, pyhamcrest, pyyaml, tenacity, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam avro-python3-1.9.1 cachetools-3.1.1 certifi-2019.9.11 chardet-3.0.4 crcmod-1.7 dill-0.3.0 docopt-0.6.2 fastavro-0.21.24 fasteners-0.15 future-0.17.1 google-api-core-1.14.2 google-apitools-0.5.28 google-auth-1.6.3 google-cloud-bigquery-1.17.1 google-cloud-bigtable-1.0.0 google-cloud-core-1.0.3 google-cloud-datastore-1.7.4 google-cloud-pubsub-1.0.2 google-resumable-media-0.4.1 googleapis-common-protos-1.6.0 grpc-google-iam-v1-0.12.3 hdfs-2.5.8 httplib2-0.12.0 idna-2.8 mock-2.0.0 monotonic-1.5 nose-1.3.7 nose-xunitmp-0.4.1 numpy-1.17.2 oauth2client-3.0.0 pandas-0.24.2 parameterized-0.6.3 pbr-5.4.3 pyarrow-0.14.1 pyasn1-0.4.7 pyasn1-modules-0.2.6 pydot-1.4.1 pyhamcrest-1.9.0 pymongo-3.9.0 pyparsing-2.4.2 python-dateutil-2.8.0 pytz-2019.2 pyyaml-5.1.2 requests-2.22.0 rsa-4.0 tenacity-5.1.1 urllib3-1.25.6
:sdks:python:test-suites:dataflow:py35:installGcpTest (Thread[Execution worker for ':',5,main]) completed. Took 22.741 secs.
:sdks:python:test-suites:dataflow:py35:integrationTest (Thread[Execution worker for ':',5,main]) started.

> Task :sdks:python:test-suites:dataflow:py35:integrationTest
Caching disabled for task ':sdks:python:test-suites:dataflow:py35:integrationTest': Caching has not been enabled for the task
Task ':sdks:python:test-suites:dataflow:py35:integrationTest' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Custom actions are attached to task ':sdks:python:test-suites:dataflow:py35:integrationTest'.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/test-suites/dataflow/py35> Command: sh -c . <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/bin/activate> && <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/scripts/run_integration_test.sh> --test_opts "--tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --attr=IT --nocapture" --pipeline_opts "--project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --input=gs://apache-beam-samples/input_small_files/ascii_sort_1MB_input.0000* --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --expect_checksum=ea0ca2e5ee4ea5f218790f28d0b9fe7d09d8d710 --num_workers=10 --autoscaling_algorithm=NONE --runner=TestDataflowRunner --sdk_location=build/apache-beam.tar.gz" --suite integrationTest-perf
Successfully started process 'command 'sh''
>>> RUNNING integration tests with pipeline options: --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --input=gs://apache-beam-samples/input_small_files/ascii_sort_1MB_input.0000* --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --expect_checksum=ea0ca2e5ee4ea5f218790f28d0b9fe7d09d8d710 --num_workers=10 --autoscaling_algorithm=NONE --runner=TestDataflowRunner --sdk_location=build/apache-beam.tar.gz
>>>   test options: --tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --attr=IT --nocapture
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'

STDERR: DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
setup.py:186: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/setuptools/dist.py>:474: UserWarning: Normalizing '2.17.0.dev' to '2.17.0.dev0'
  normalized_version,
beam_fn_api.proto: warning: Import google/protobuf/descriptor.proto but not used.
beam_fn_api.proto: warning: Import google/protobuf/wrappers.proto but not used.
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
RefactoringTool: Skipping optional fixer: idioms
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
RefactoringTool: Skipping optional fixer: ws_comma
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
RefactoringTool: Files that were modified:
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md

<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/setuptools/dist.py>:474: UserWarning: Normalizing '2.17.0.dev' to '2.17.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... 
IssueCommand timed out after 1200 seconds.  Process was killed by perfkitbenchmarker.
2019-10-02 01:08:38,352 eace5abd MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-10-02 01:08:38,353 eace5abd MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-10-02 01:08:38,353 eace5abd MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/runs/eace5abd/pkb.log>
2019-10-02 01:08:38,353 eace5abd MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/runs/eace5abd/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_WordCountIT_Py35 #553

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/553/display/redirect?page=changes>

Changes:

[bhulette] Always use UTC when creating instances of baselocal

[robertwb] [BEAM-6896] Loosen PyYAML dependency.


------------------------------------------
[...truncated 181.76 KB...]

> Task :sdks:python:test-suites:dataflow:py35:installGcpTest
Caching disabled for task ':sdks:python:test-suites:dataflow:py35:installGcpTest': Caching has not been enabled for the task
Task ':sdks:python:test-suites:dataflow:py35:installGcpTest' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Custom actions are attached to task ':sdks:python:test-suites:dataflow:py35:installGcpTest'.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/test-suites/dataflow/py35> Command: sh -c . <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/bin/activate> && pip install --retries 10 -e <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/[gcp,test]>
Successfully started process 'command 'sh''
Obtaining file://<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python>
Collecting crcmod<2.0,>=1.7 (from apache-beam==2.17.0.dev0)
Collecting dill<0.3.1,>=0.3.0 (from apache-beam==2.17.0.dev0)
Collecting fastavro<0.22,>=0.21.4 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ac/7d/e63a1ba78326e42a69bda88b1fcfca22ddd773c4cc51ae85b3b869abcff2/fastavro-0.21.24-cp35-cp35m-manylinux1_x86_64.whl
Collecting future<1.0.0,>=0.16.0 (from apache-beam==2.17.0.dev0)
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from apache-beam==2.17.0.dev0) (1.24.0)
Collecting hdfs<3.0.0,>=2.1.0 (from apache-beam==2.17.0.dev0)
Collecting httplib2<=0.12.0,>=0.8 (from apache-beam==2.17.0.dev0)
Collecting mock<3.0.0,>=1.0.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e6/35/f187bdf23be87092bd0f1200d43d23076cee4d0dec109f195173fd3ebc79/mock-2.0.0-py2.py3-none-any.whl
Collecting pymongo<4.0.0,>=3.8.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/fe/96/3f43c48b2801e5cefe893421d67640cdc2b7cd940a51790b5c2062fb044e/pymongo-3.9.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting oauth2client<4,>=2.0.1 (from apache-beam==2.17.0.dev0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from apache-beam==2.17.0.dev0) (3.9.2)
Collecting pydot<2,>=1.2.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/33/d1/b1479a770f66d962f545c2101630ce1d5592d90cb4f083d38862e93d16d2/pydot-1.4.1-py2.py3-none-any.whl
Collecting python-dateutil<3,>=2.8.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/41/17/c62faccbfbd163c7f57f3844689e3a78bae1f403648a6afb1d0866d87fbb/python_dateutil-2.8.0-py2.py3-none-any.whl
Collecting pytz>=2018.3 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/87/76/46d697698a143e05f77bec5a526bf4e56a0be61d63425b68f4ba553b51f2/pytz-2019.2-py2.py3-none-any.whl
Collecting pyarrow<0.15.0,>=0.11.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/54/95/bcbe5658d6ac65af35996a80ed66d82c50f9c0b36424f4758cd54dd08d73/pyarrow-0.14.1-cp35-cp35m-manylinux2010_x86_64.whl
Collecting avro-python3<2.0.0,>=1.8.1 (from apache-beam==2.17.0.dev0)
Collecting cachetools<4,>=3.1.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/2f/a6/30b0a0bef12283e83e58c1d6e7b5aabc7acfc4110df81a4471655d33e704/cachetools-3.1.1-py2.py3-none-any.whl
Collecting google-apitools<0.5.29,>=0.5.28 (from apache-beam==2.17.0.dev0)
Collecting google-cloud-datastore<1.8.0,>=1.7.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/d0/aa/29cbcf8cf7d08ce2d55b9dce858f7c632b434cb6451bed17cb4275804217/google_cloud_datastore-1.7.4-py2.py3-none-any.whl
Collecting google-cloud-pubsub<1.1.0,>=0.39.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/d3/91/07a82945a7396ea34debafd476724bb5fc267c292790fdf2138c693f95c5/google_cloud_pubsub-1.0.2-py2.py3-none-any.whl
Collecting google-cloud-bigquery<1.18.0,>=1.6.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/a4/96/1b9cf1d43869c47a205aad411dac7c3040df6093d63c39273fa4d4c45da7/google_cloud_bigquery-1.17.1-py2.py3-none-any.whl
Collecting google-cloud-core<2,>=0.28.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ee/f0/084f598629db8e6ec3627688723875cdb03637acb6d86999bb105a71df64/google_cloud_core-1.0.3-py2.py3-none-any.whl
Collecting google-cloud-bigtable<1.1.0,>=0.31.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/95/af/0ef7d097a1d5ad0c843867600e86de915e8ab8864740f49a4636cfb51af6/google_cloud_bigtable-1.0.0-py2.py3-none-any.whl
Collecting nose>=1.3.7 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/15/d8/dd071918c040f50fa1cf80da16423af51ff8ce4a0f2399b7bf8de45ac3d9/nose-1.3.7-py3-none-any.whl
Collecting nose_xunitmp>=0.4.1 (from apache-beam==2.17.0.dev0)
Collecting numpy<2,>=1.14.3 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/9b/21/2b18339d24a2f73dcefb2f10f48aff6182e16da83e3a612684443c6cfb29/numpy-1.17.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting pandas<0.25,>=0.23.4 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/74/24/0cdbf8907e1e3bc5a8da03345c23cbed7044330bb8f73bb12e711a640a00/pandas-0.24.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting parameterized<0.7.0,>=0.6.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/3a/49/75f6dadb09e2f8ace3cdffe0c99a04f1b98dff41fbf9e768665d8b469e29/parameterized-0.6.3-py2.py3-none-any.whl
Collecting pyhamcrest<2.0,>=1.9 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/9a/d5/d37fd731b7d0e91afcc84577edeccf4638b4f9b82f5ffe2f8b62e2ddc609/PyHamcrest-1.9.0-py2.py3-none-any.whl
Collecting pyyaml<6.0.0,>=3.12 (from apache-beam==2.17.0.dev0)
  Downloading https://files.pythonhosted.org/packages/e3/e8/b3212641ee2718d556df0f23f78de8303f068fe29cdaa7a91018849582fe/PyYAML-5.1.2.tar.gz (265kB)
Collecting tenacity<6.0,>=5.0.2 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/1e/a1/be8c8610f4620c56790965ba2b564dd76d13cbcd7c2ff8f6053ce63027fb/tenacity-5.1.1-py2.py3-none-any.whl
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.17.0.dev0) (1.12.0)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
Collecting requests>=2.7.0 (from hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting pbr>=0.11 (from mock<3.0.0,>=1.0.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/46/a4/d5c83831a3452713e4b4f126149bc4fbda170f7cb16a86a00ce57ce0e9ad/pbr-5.4.3-py2.py3-none-any.whl
Collecting pyasn1>=0.1.7 (from oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/a1/71/8f0d444e3a74e5640a3d5d967c1c6b015da9c655f35b2d308a55d907a517/pyasn1-0.4.7-py2.py3-none-any.whl
Collecting rsa>=3.1.4 (from oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/02/e5/38518af393f7c214357079ce67a317307936896e961e35450b70fad2a9cf/rsa-4.0-py2.py3-none-any.whl
Collecting pyasn1-modules>=0.0.5 (from oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/be/70/e5ea8afd6d08a4b99ebfc77bd1845248d56cfcf43d11f9dc324b9580a35c/pyasn1_modules-0.2.6-py2.py3-none-any.whl
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.17.0.dev0) (41.2.0)
Collecting pyparsing>=2.1.4 (from pydot<2,>=1.2.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/11/fa/0160cd525c62d7abd076a070ff02b2b94de589f1a9789774f17d7c54058e/pyparsing-2.4.2-py2.py3-none-any.whl
Collecting fasteners>=0.14 (from google-apitools<0.5.29,>=0.5.28->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/18/bd/55eb2d6397b9c0e263af9d091ebdb756b15756029b3cededf6461481bc63/fasteners-0.15-py2.py3-none-any.whl
Collecting google-api-core[grpc]<2.0.0dev,>=1.6.0 (from google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/71/e5/7059475b3013a3c75abe35015c5761735ab224eb1b129fee7c8e376e7805/google_api_core-1.14.2-py2.py3-none-any.whl
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3 (from google-cloud-pubsub<1.1.0,>=0.39.0->apache-beam==2.17.0.dev0)
Collecting google-resumable-media<0.5.0dev,>=0.3.1 (from google-cloud-bigquery<1.18.0,>=1.6.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/96/d7/b29a41b01b854480891dfc408211ffb0cc7a2a3d5f15a3b6740ec18c845b/google_resumable_media-0.4.1-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/18/b0/8146a4f8dd402f60744fa380bc73ca47303cccf8b9190fd16a827281eac2/certifi-2019.9.11-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e0/da/55f51ea951e1b7c63a579c09dd7db825bb730ec1fe9c0180fc77bfb31448/urllib3-1.25.6-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting monotonic>=0.1 (from fasteners>=0.14->google-apitools<0.5.29,>=0.5.28->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ac/aa/063eca6a416f397bd99552c534c6d11d57f58f2e94c14780f3bbf818c4cf/monotonic-1.5-py2.py3-none-any.whl
Collecting googleapis-common-protos<2.0dev,>=1.6.0 (from google-api-core[grpc]<2.0.0dev,>=1.6.0->google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.17.0.dev0)
Collecting google-auth<2.0dev,>=0.4.0 (from google-api-core[grpc]<2.0.0dev,>=1.6.0->google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/c5/9b/ed0516cc1f7609fb0217e3057ff4f0f9f3e3ce79a369c6af4a6c5ca25664/google_auth-1.6.3-py2.py3-none-any.whl
Building wheels for collected packages: pyyaml
  Building wheel for pyyaml (setup.py): started
  Building wheel for pyyaml (setup.py): finished with status 'done'
  Created wheel for pyyaml: filename=PyYAML-5.1.2-cp35-cp35m-linux_x86_64.whl size=44103 sha256=e2c41dda9bb7cf37c8f0d3b27cc475631c4fae5b487ef879dff0bfb2a9f5d354
  Stored in directory: /home/jenkins/.cache/pip/wheels/d9/45/dd/65f0b38450c47cf7e5312883deb97d065e030c5cca0a365030
Successfully built pyyaml
Installing collected packages: crcmod, dill, fastavro, future, docopt, certifi, idna, urllib3, chardet, requests, hdfs, httplib2, pbr, mock, pymongo, pyasn1, rsa, pyasn1-modules, oauth2client, pyparsing, pydot, python-dateutil, pytz, numpy, pyarrow, avro-python3, cachetools, monotonic, fasteners, google-apitools, googleapis-common-protos, google-auth, google-api-core, google-cloud-core, google-cloud-datastore, grpc-google-iam-v1, google-cloud-pubsub, google-resumable-media, google-cloud-bigquery, google-cloud-bigtable, nose, nose-xunitmp, pandas, parameterized, pyhamcrest, pyyaml, tenacity, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam avro-python3-1.9.1 cachetools-3.1.1 certifi-2019.9.11 chardet-3.0.4 crcmod-1.7 dill-0.3.0 docopt-0.6.2 fastavro-0.21.24 fasteners-0.15 future-0.17.1 google-api-core-1.14.2 google-apitools-0.5.28 google-auth-1.6.3 google-cloud-bigquery-1.17.1 google-cloud-bigtable-1.0.0 google-cloud-core-1.0.3 google-cloud-datastore-1.7.4 google-cloud-pubsub-1.0.2 google-resumable-media-0.4.1 googleapis-common-protos-1.6.0 grpc-google-iam-v1-0.12.3 hdfs-2.5.8 httplib2-0.12.0 idna-2.8 mock-2.0.0 monotonic-1.5 nose-1.3.7 nose-xunitmp-0.4.1 numpy-1.17.2 oauth2client-3.0.0 pandas-0.24.2 parameterized-0.6.3 pbr-5.4.3 pyarrow-0.14.1 pyasn1-0.4.7 pyasn1-modules-0.2.6 pydot-1.4.1 pyhamcrest-1.9.0 pymongo-3.9.0 pyparsing-2.4.2 python-dateutil-2.8.0 pytz-2019.2 pyyaml-5.1.2 requests-2.22.0 rsa-4.0 tenacity-5.1.1 urllib3-1.25.6
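Each "Collecting" line above carries the version pin declared by the SDK, and pip takes the newest release that still satisfies it; that is why fastavro<0.22,>=0.21.4 resolves to 0.21.24 and pyyaml<6.0.0,>=3.12 resolves to 5.1.2 here. A small sketch of that selection, assuming the third-party packaging library and a hand-picked list of candidate versions (the candidates are examples, not the full PyPI history):

# Sketch only: evaluates two of the specifier strings seen in this log
# against example candidate versions; pip's real resolver does more work.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

pins = {
    "fastavro": SpecifierSet("<0.22,>=0.21.4"),
    "pyyaml": SpecifierSet("<6.0.0,>=3.12"),
}
candidates = {
    "fastavro": ["0.21.4", "0.21.24", "0.22.0"],
    "pyyaml": ["3.13", "5.1.2", "6.0"],
}

for name, spec in pins.items():
    # Keep only versions inside the pin, then take the highest one.
    allowed = [v for v in candidates[name] if Version(v) in spec]
    print(name, "->", max(allowed, key=Version))   # fastavro -> 0.21.24, pyyaml -> 5.1.2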
:sdks:python:test-suites:dataflow:py35:installGcpTest (Thread[Execution worker for ':',5,main]) completed. Took 34.44 secs.
:sdks:python:test-suites:dataflow:py35:integrationTest (Thread[Execution worker for ':',5,main]) started.

> Task :sdks:python:test-suites:dataflow:py35:integrationTest
Caching disabled for task ':sdks:python:test-suites:dataflow:py35:integrationTest': Caching has not been enabled for the task
Task ':sdks:python:test-suites:dataflow:py35:integrationTest' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Custom actions are attached to task ':sdks:python:test-suites:dataflow:py35:integrationTest'.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/test-suites/dataflow/py35> Command: sh -c . <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/bin/activate> && <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/scripts/run_integration_test.sh> --test_opts "--tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --attr=IT --nocapture" --pipeline_opts "--project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --input=gs://apache-beam-samples/input_small_files/ascii_sort_1MB_input.0000* --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --expect_checksum=ea0ca2e5ee4ea5f218790f28d0b9fe7d09d8d710 --num_workers=10 --autoscaling_algorithm=NONE --runner=TestDataflowRunner --sdk_location=build/apache-beam.tar.gz" --suite integrationTest-perf
Successfully started process 'command 'sh''
>>> RUNNING integration tests with pipeline options: --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --input=gs://apache-beam-samples/input_small_files/ascii_sort_1MB_input.0000* --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --expect_checksum=ea0ca2e5ee4ea5f218790f28d0b9fe7d09d8d710 --num_workers=10 --autoscaling_algorithm=NONE --runner=TestDataflowRunner --sdk_location=build/apache-beam.tar.gz
>>>   test options: --tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --attr=IT --nocapture
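The --pipeline_opts string shown above is ultimately handed to the WordCount pipeline as ordinary Beam pipeline options. A hedged sketch of splitting such a flag string and reading it back through the SDK's option classes (the values are copied from the log; the parsing code is illustrative, not the test harness's own):

# Illustrative only: parses a subset of the --pipeline_opts string the way
# a pipeline would eventually see it; this is not run_integration_test.sh.
import shlex

from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

opts_str = (
    "--project=apache-beam-testing "
    "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it "
    "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it "
    "--num_workers=10 --autoscaling_algorithm=NONE "
    "--runner=TestDataflowRunner"
)

options = PipelineOptions(shlex.split(opts_str))
gcp = options.view_as(GoogleCloudOptions)
print(gcp.project, gcp.staging_location, gcp.temp_location)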
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'

STDERR: DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
setup.py:186: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/setuptools/dist.py>:474: UserWarning: Normalizing '2.17.0.dev' to '2.17.0.dev0'
  normalized_version,
beam_fn_api.proto: warning: Import google/protobuf/descriptor.proto but not used.
beam_fn_api.proto: warning: Import google/protobuf/wrappers.proto but not used.
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt (repeated 12 times)
RefactoringTool: Skipping optional fixer: idioms
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt (repeated 23 times)
RefactoringTool: Skipping optional fixer: ws_comma
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt (repeated 21 times)
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
RefactoringTool: Files that were modified:
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
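The RefactoringTool messages above are lib2to3 rewriting the freshly generated *_pb2.py modules in place. A hedged sketch of applying the same tool to one of those files from the command line (the exact invocation is an assumption; the build drives this from its own codegen step rather than like this):

# Sketch only: runs the stock 2to3 console script, which prints the same
# "RefactoringTool:" messages, over one generated module (path relative to
# sdks/python).
import subprocess

subprocess.run(
    ["2to3", "--write", "--nobackups",
     "apache_beam/portability/api/beam_fn_api_pb2.py"],
    check=True,
)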
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md

<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/setuptools/dist.py>:474: UserWarning: Normalizing '2.17.0.dev' to '2.17.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... 
IssueCommand timed out after 1200 seconds.  Process was killed by perfkitbenchmarker.
2019-10-01 19:16:52,382 398dba3b MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-10-01 19:16:52,387 398dba3b MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-10-01 19:16:52,388 398dba3b MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/runs/398dba3b/pkb.log>
2019-10-01 19:16:52,392 398dba3b MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/runs/398dba3b/completion_statuses.json>
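The failure itself is a timeout: PerfKit Benchmarker's IssueCommand wrapper gives the integration test 1200 seconds, kills the process when it does not finish, and records the benchmark as FAILED. A minimal sketch of that timeout-and-kill pattern using only the standard library (this is not perfkitbenchmarker's implementation, just the same idea):

# Sketch of a timeout-and-kill wrapper in the spirit of IssueCommand.
import subprocess

TIMEOUT_SECONDS = 1200  # the limit reported in the log above

def run_with_timeout(cmd):
    proc = subprocess.Popen(cmd)
    try:
        return proc.wait(timeout=TIMEOUT_SECONDS)
    except subprocess.TimeoutExpired:
        proc.kill()   # mirrors "Process was killed by perfkitbenchmarker"
        proc.wait()
        print("IssueCommand timed out after", TIMEOUT_SECONDS, "seconds.")
        return None

# e.g. run_with_timeout(["scripts/run_integration_test.sh", "--suite", "integrationTest-perf"])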
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_WordCountIT_Py35 #552

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/552/display/redirect?page=changes>

Changes:

[mxm] [BEAM-5428] Implement cross-bundle user state caching in the Python SDK

[mxm] [BEAM-5428] Add pipeline option to enable caching / Disable caching by

[ajhalaria] [BEAM-8300] - Add readObject to initialize producer if it is null

[lcwik] [BEAM-8021] Swap build-tools to be compile only so it isn't a "required"


------------------------------------------
[...truncated 179.37 KB...]
Successfully started process 'command 'sh''
Collecting future==0.16.0
Installing collected packages: future
Successfully installed future-0.16.0
Create distribution tar file apache-beam.tar.gz in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/build>
:sdks:python:sdist (Thread[Execution worker for ':',5,main]) completed. Took 6.784 secs.
:sdks:python:test-suites:dataflow:py35:installGcpTest (Thread[Execution worker for ':',5,main]) started.

> Task :sdks:python:test-suites:dataflow:py35:installGcpTest
Caching disabled for task ':sdks:python:test-suites:dataflow:py35:installGcpTest': Caching has not been enabled for the task
Task ':sdks:python:test-suites:dataflow:py35:installGcpTest' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Custom actions are attached to task ':sdks:python:test-suites:dataflow:py35:installGcpTest'.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/test-suites/dataflow/py35> Command: sh -c . <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/bin/activate> && pip install --retries 10 -e <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/[gcp,test]>
Successfully started process 'command 'sh''
Obtaining file://<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python>
Collecting crcmod<2.0,>=1.7 (from apache-beam==2.17.0.dev0)
Collecting dill<0.3.1,>=0.3.0 (from apache-beam==2.17.0.dev0)
Collecting fastavro<0.22,>=0.21.4 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ac/7d/e63a1ba78326e42a69bda88b1fcfca22ddd773c4cc51ae85b3b869abcff2/fastavro-0.21.24-cp35-cp35m-manylinux1_x86_64.whl
Collecting future<1.0.0,>=0.16.0 (from apache-beam==2.17.0.dev0)
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from apache-beam==2.17.0.dev0) (1.24.0)
Collecting hdfs<3.0.0,>=2.1.0 (from apache-beam==2.17.0.dev0)
Collecting httplib2<=0.12.0,>=0.8 (from apache-beam==2.17.0.dev0)
Collecting mock<3.0.0,>=1.0.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e6/35/f187bdf23be87092bd0f1200d43d23076cee4d0dec109f195173fd3ebc79/mock-2.0.0-py2.py3-none-any.whl
Collecting pymongo<4.0.0,>=3.8.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/fe/96/3f43c48b2801e5cefe893421d67640cdc2b7cd940a51790b5c2062fb044e/pymongo-3.9.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting oauth2client<4,>=2.0.1 (from apache-beam==2.17.0.dev0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from apache-beam==2.17.0.dev0) (3.9.2)
Collecting pydot<2,>=1.2.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/33/d1/b1479a770f66d962f545c2101630ce1d5592d90cb4f083d38862e93d16d2/pydot-1.4.1-py2.py3-none-any.whl
Collecting python-dateutil<3,>=2.8.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/41/17/c62faccbfbd163c7f57f3844689e3a78bae1f403648a6afb1d0866d87fbb/python_dateutil-2.8.0-py2.py3-none-any.whl
Collecting pytz>=2018.3 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/87/76/46d697698a143e05f77bec5a526bf4e56a0be61d63425b68f4ba553b51f2/pytz-2019.2-py2.py3-none-any.whl
Collecting pyyaml<4.0.0,>=3.12 (from apache-beam==2.17.0.dev0)
Collecting avro-python3<2.0.0,>=1.8.1 (from apache-beam==2.17.0.dev0)
Collecting pyarrow<0.15.0,>=0.11.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/54/95/bcbe5658d6ac65af35996a80ed66d82c50f9c0b36424f4758cd54dd08d73/pyarrow-0.14.1-cp35-cp35m-manylinux2010_x86_64.whl
Collecting cachetools<4,>=3.1.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/2f/a6/30b0a0bef12283e83e58c1d6e7b5aabc7acfc4110df81a4471655d33e704/cachetools-3.1.1-py2.py3-none-any.whl
Collecting google-apitools<0.5.29,>=0.5.28 (from apache-beam==2.17.0.dev0)
Collecting google-cloud-datastore<1.8.0,>=1.7.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/d0/aa/29cbcf8cf7d08ce2d55b9dce858f7c632b434cb6451bed17cb4275804217/google_cloud_datastore-1.7.4-py2.py3-none-any.whl
Collecting google-cloud-pubsub<1.1.0,>=0.39.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/d3/91/07a82945a7396ea34debafd476724bb5fc267c292790fdf2138c693f95c5/google_cloud_pubsub-1.0.2-py2.py3-none-any.whl
Collecting google-cloud-bigquery<1.18.0,>=1.6.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/a4/96/1b9cf1d43869c47a205aad411dac7c3040df6093d63c39273fa4d4c45da7/google_cloud_bigquery-1.17.1-py2.py3-none-any.whl
Collecting google-cloud-core<2,>=0.28.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ee/f0/084f598629db8e6ec3627688723875cdb03637acb6d86999bb105a71df64/google_cloud_core-1.0.3-py2.py3-none-any.whl
Collecting google-cloud-bigtable<1.1.0,>=0.31.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/95/af/0ef7d097a1d5ad0c843867600e86de915e8ab8864740f49a4636cfb51af6/google_cloud_bigtable-1.0.0-py2.py3-none-any.whl
Collecting nose>=1.3.7 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/15/d8/dd071918c040f50fa1cf80da16423af51ff8ce4a0f2399b7bf8de45ac3d9/nose-1.3.7-py3-none-any.whl
Collecting nose_xunitmp>=0.4.1 (from apache-beam==2.17.0.dev0)
Collecting numpy<2,>=1.14.3 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/9b/21/2b18339d24a2f73dcefb2f10f48aff6182e16da83e3a612684443c6cfb29/numpy-1.17.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting pandas<0.25,>=0.23.4 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/74/24/0cdbf8907e1e3bc5a8da03345c23cbed7044330bb8f73bb12e711a640a00/pandas-0.24.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting parameterized<0.7.0,>=0.6.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/3a/49/75f6dadb09e2f8ace3cdffe0c99a04f1b98dff41fbf9e768665d8b469e29/parameterized-0.6.3-py2.py3-none-any.whl
Collecting pyhamcrest<2.0,>=1.9 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/9a/d5/d37fd731b7d0e91afcc84577edeccf4638b4f9b82f5ffe2f8b62e2ddc609/PyHamcrest-1.9.0-py2.py3-none-any.whl
Collecting tenacity<6.0,>=5.0.2 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/1e/a1/be8c8610f4620c56790965ba2b564dd76d13cbcd7c2ff8f6053ce63027fb/tenacity-5.1.1-py2.py3-none-any.whl
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.17.0.dev0) (1.12.0)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
Collecting requests>=2.7.0 (from hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting pbr>=0.11 (from mock<3.0.0,>=1.0.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/46/a4/d5c83831a3452713e4b4f126149bc4fbda170f7cb16a86a00ce57ce0e9ad/pbr-5.4.3-py2.py3-none-any.whl
Collecting pyasn1-modules>=0.0.5 (from oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/be/70/e5ea8afd6d08a4b99ebfc77bd1845248d56cfcf43d11f9dc324b9580a35c/pyasn1_modules-0.2.6-py2.py3-none-any.whl
Collecting pyasn1>=0.1.7 (from oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/a1/71/8f0d444e3a74e5640a3d5d967c1c6b015da9c655f35b2d308a55d907a517/pyasn1-0.4.7-py2.py3-none-any.whl
Collecting rsa>=3.1.4 (from oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/02/e5/38518af393f7c214357079ce67a317307936896e961e35450b70fad2a9cf/rsa-4.0-py2.py3-none-any.whl
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.17.0.dev0) (41.2.0)
Collecting pyparsing>=2.1.4 (from pydot<2,>=1.2.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/11/fa/0160cd525c62d7abd076a070ff02b2b94de589f1a9789774f17d7c54058e/pyparsing-2.4.2-py2.py3-none-any.whl
Collecting fasteners>=0.14 (from google-apitools<0.5.29,>=0.5.28->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/18/bd/55eb2d6397b9c0e263af9d091ebdb756b15756029b3cededf6461481bc63/fasteners-0.15-py2.py3-none-any.whl
Collecting google-api-core[grpc]<2.0.0dev,>=1.6.0 (from google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/71/e5/7059475b3013a3c75abe35015c5761735ab224eb1b129fee7c8e376e7805/google_api_core-1.14.2-py2.py3-none-any.whl
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3 (from google-cloud-pubsub<1.1.0,>=0.39.0->apache-beam==2.17.0.dev0)
Collecting google-resumable-media<0.5.0dev,>=0.3.1 (from google-cloud-bigquery<1.18.0,>=1.6.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/96/d7/b29a41b01b854480891dfc408211ffb0cc7a2a3d5f15a3b6740ec18c845b/google_resumable_media-0.4.1-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/18/b0/8146a4f8dd402f60744fa380bc73ca47303cccf8b9190fd16a827281eac2/certifi-2019.9.11-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e0/da/55f51ea951e1b7c63a579c09dd7db825bb730ec1fe9c0180fc77bfb31448/urllib3-1.25.6-py2.py3-none-any.whl
Collecting monotonic>=0.1 (from fasteners>=0.14->google-apitools<0.5.29,>=0.5.28->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ac/aa/063eca6a416f397bd99552c534c6d11d57f58f2e94c14780f3bbf818c4cf/monotonic-1.5-py2.py3-none-any.whl
Collecting google-auth<2.0dev,>=0.4.0 (from google-api-core[grpc]<2.0.0dev,>=1.6.0->google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/c5/9b/ed0516cc1f7609fb0217e3057ff4f0f9f3e3ce79a369c6af4a6c5ca25664/google_auth-1.6.3-py2.py3-none-any.whl
Collecting googleapis-common-protos<2.0dev,>=1.6.0 (from google-api-core[grpc]<2.0.0dev,>=1.6.0->google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.17.0.dev0)
Installing collected packages: crcmod, dill, fastavro, future, docopt, certifi, chardet, idna, urllib3, requests, hdfs, httplib2, pbr, mock, pymongo, pyasn1, pyasn1-modules, rsa, oauth2client, pyparsing, pydot, python-dateutil, pytz, pyyaml, avro-python3, numpy, pyarrow, cachetools, monotonic, fasteners, google-apitools, google-auth, googleapis-common-protos, google-api-core, google-cloud-core, google-cloud-datastore, grpc-google-iam-v1, google-cloud-pubsub, google-resumable-media, google-cloud-bigquery, google-cloud-bigtable, nose, nose-xunitmp, pandas, parameterized, pyhamcrest, tenacity, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam avro-python3-1.9.1 cachetools-3.1.1 certifi-2019.9.11 chardet-3.0.4 crcmod-1.7 dill-0.3.0 docopt-0.6.2 fastavro-0.21.24 fasteners-0.15 future-0.17.1 google-api-core-1.14.2 google-apitools-0.5.28 google-auth-1.6.3 google-cloud-bigquery-1.17.1 google-cloud-bigtable-1.0.0 google-cloud-core-1.0.3 google-cloud-datastore-1.7.4 google-cloud-pubsub-1.0.2 google-resumable-media-0.4.1 googleapis-common-protos-1.6.0 grpc-google-iam-v1-0.12.3 hdfs-2.5.8 httplib2-0.12.0 idna-2.8 mock-2.0.0 monotonic-1.5 nose-1.3.7 nose-xunitmp-0.4.1 numpy-1.17.2 oauth2client-3.0.0 pandas-0.24.2 parameterized-0.6.3 pbr-5.4.3 pyarrow-0.14.1 pyasn1-0.4.7 pyasn1-modules-0.2.6 pydot-1.4.1 pyhamcrest-1.9.0 pymongo-3.9.0 pyparsing-2.4.2 python-dateutil-2.8.0 pytz-2019.2 pyyaml-3.13 requests-2.22.0 rsa-4.0 tenacity-5.1.1 urllib3-1.25.6
:sdks:python:test-suites:dataflow:py35:installGcpTest (Thread[Execution worker for ':',5,main]) completed. Took 18.239 secs.
:sdks:python:test-suites:dataflow:py35:integrationTest (Thread[Execution worker for ':',5,main]) started.

> Task :sdks:python:test-suites:dataflow:py35:integrationTest
Caching disabled for task ':sdks:python:test-suites:dataflow:py35:integrationTest': Caching has not been enabled for the task
Task ':sdks:python:test-suites:dataflow:py35:integrationTest' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Custom actions are attached to task ':sdks:python:test-suites:dataflow:py35:integrationTest'.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/test-suites/dataflow/py35> Command: sh -c . <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/bin/activate> && <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/scripts/run_integration_test.sh> --test_opts "--tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --attr=IT --nocapture" --pipeline_opts "--project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --input=gs://apache-beam-samples/input_small_files/ascii_sort_1MB_input.0000* --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --expect_checksum=ea0ca2e5ee4ea5f218790f28d0b9fe7d09d8d710 --num_workers=10 --autoscaling_algorithm=NONE --runner=TestDataflowRunner --sdk_location=build/apache-beam.tar.gz" --suite integrationTest-perf
Successfully started process 'command 'sh''
>>> RUNNING integration tests with pipeline options: --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --input=gs://apache-beam-samples/input_small_files/ascii_sort_1MB_input.0000* --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --expect_checksum=ea0ca2e5ee4ea5f218790f28d0b9fe7d09d8d710 --num_workers=10 --autoscaling_algorithm=NONE --runner=TestDataflowRunner --sdk_location=build/apache-beam.tar.gz
>>>   test options: --tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --attr=IT --nocapture
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'

STDERR: DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
setup.py:186: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/setuptools/dist.py>:474: UserWarning: Normalizing '2.17.0.dev' to '2.17.0.dev0'
  normalized_version,
beam_fn_api.proto: warning: Import google/protobuf/descriptor.proto but not used.
beam_fn_api.proto: warning: Import google/protobuf/wrappers.proto but not used.
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt (repeated 12 times)
RefactoringTool: Skipping optional fixer: idioms
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt (repeated 23 times)
RefactoringTool: Skipping optional fixer: ws_comma
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt (repeated 21 times)
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
RefactoringTool: Files that were modified:
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md

<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/setuptools/dist.py>:474: UserWarning: Normalizing '2.17.0.dev' to '2.17.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... 
IssueCommand timed out after 1200 seconds.  Process was killed by perfkitbenchmarker.
2019-10-01 13:20:05,709 619ca5b4 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-10-01 13:20:05,710 619ca5b4 MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-10-01 13:20:05,710 619ca5b4 MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/runs/619ca5b4/pkb.log>
2019-10-01 13:20:05,710 619ca5b4 MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/runs/619ca5b4/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_WordCountIT_Py35 #551

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/551/display/redirect>

Changes:


------------------------------------------
[...truncated 179.45 KB...]
Successfully started process 'command 'sh''
Collecting future==0.16.0
Installing collected packages: future
Successfully installed future-0.16.0
Create distribution tar file apache-beam.tar.gz in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/build>
:sdks:python:sdist (Thread[Execution worker for ':',5,main]) completed. Took 6.496 secs.
:sdks:python:test-suites:dataflow:py35:installGcpTest (Thread[Execution worker for ':',5,main]) started.

> Task :sdks:python:test-suites:dataflow:py35:installGcpTest
Caching disabled for task ':sdks:python:test-suites:dataflow:py35:installGcpTest': Caching has not been enabled for the task
Task ':sdks:python:test-suites:dataflow:py35:installGcpTest' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Custom actions are attached to task ':sdks:python:test-suites:dataflow:py35:installGcpTest'.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/test-suites/dataflow/py35> Command: sh -c . <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/bin/activate> && pip install --retries 10 -e <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/[gcp,test]>
Successfully started process 'command 'sh''
Obtaining file://<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python>
Collecting crcmod<2.0,>=1.7 (from apache-beam==2.17.0.dev0)
Collecting dill<0.3.1,>=0.3.0 (from apache-beam==2.17.0.dev0)
Collecting fastavro<0.22,>=0.21.4 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ac/7d/e63a1ba78326e42a69bda88b1fcfca22ddd773c4cc51ae85b3b869abcff2/fastavro-0.21.24-cp35-cp35m-manylinux1_x86_64.whl
Collecting future<1.0.0,>=0.16.0 (from apache-beam==2.17.0.dev0)
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from apache-beam==2.17.0.dev0) (1.24.0)
Collecting hdfs<3.0.0,>=2.1.0 (from apache-beam==2.17.0.dev0)
Collecting httplib2<=0.12.0,>=0.8 (from apache-beam==2.17.0.dev0)
Collecting mock<3.0.0,>=1.0.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e6/35/f187bdf23be87092bd0f1200d43d23076cee4d0dec109f195173fd3ebc79/mock-2.0.0-py2.py3-none-any.whl
Collecting pymongo<4.0.0,>=3.8.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/fe/96/3f43c48b2801e5cefe893421d67640cdc2b7cd940a51790b5c2062fb044e/pymongo-3.9.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting oauth2client<4,>=2.0.1 (from apache-beam==2.17.0.dev0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from apache-beam==2.17.0.dev0) (3.9.2)
Collecting pydot<2,>=1.2.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/33/d1/b1479a770f66d962f545c2101630ce1d5592d90cb4f083d38862e93d16d2/pydot-1.4.1-py2.py3-none-any.whl
Collecting python-dateutil<3,>=2.8.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/41/17/c62faccbfbd163c7f57f3844689e3a78bae1f403648a6afb1d0866d87fbb/python_dateutil-2.8.0-py2.py3-none-any.whl
Collecting pytz>=2018.3 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/87/76/46d697698a143e05f77bec5a526bf4e56a0be61d63425b68f4ba553b51f2/pytz-2019.2-py2.py3-none-any.whl
Collecting pyyaml<4.0.0,>=3.12 (from apache-beam==2.17.0.dev0)
Collecting avro-python3<2.0.0,>=1.8.1 (from apache-beam==2.17.0.dev0)
Collecting pyarrow<0.15.0,>=0.11.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/54/95/bcbe5658d6ac65af35996a80ed66d82c50f9c0b36424f4758cd54dd08d73/pyarrow-0.14.1-cp35-cp35m-manylinux2010_x86_64.whl
Collecting cachetools<4,>=3.1.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/2f/a6/30b0a0bef12283e83e58c1d6e7b5aabc7acfc4110df81a4471655d33e704/cachetools-3.1.1-py2.py3-none-any.whl
Collecting google-apitools<0.5.29,>=0.5.28 (from apache-beam==2.17.0.dev0)
Collecting google-cloud-datastore<1.8.0,>=1.7.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/d0/aa/29cbcf8cf7d08ce2d55b9dce858f7c632b434cb6451bed17cb4275804217/google_cloud_datastore-1.7.4-py2.py3-none-any.whl
Collecting google-cloud-pubsub<1.1.0,>=0.39.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/d3/91/07a82945a7396ea34debafd476724bb5fc267c292790fdf2138c693f95c5/google_cloud_pubsub-1.0.2-py2.py3-none-any.whl
Collecting google-cloud-bigquery<1.18.0,>=1.6.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/a4/96/1b9cf1d43869c47a205aad411dac7c3040df6093d63c39273fa4d4c45da7/google_cloud_bigquery-1.17.1-py2.py3-none-any.whl
Collecting google-cloud-core<2,>=0.28.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ee/f0/084f598629db8e6ec3627688723875cdb03637acb6d86999bb105a71df64/google_cloud_core-1.0.3-py2.py3-none-any.whl
Collecting google-cloud-bigtable<1.1.0,>=0.31.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/95/af/0ef7d097a1d5ad0c843867600e86de915e8ab8864740f49a4636cfb51af6/google_cloud_bigtable-1.0.0-py2.py3-none-any.whl
Collecting nose>=1.3.7 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/15/d8/dd071918c040f50fa1cf80da16423af51ff8ce4a0f2399b7bf8de45ac3d9/nose-1.3.7-py3-none-any.whl
Collecting nose_xunitmp>=0.4.1 (from apache-beam==2.17.0.dev0)
Collecting numpy<2,>=1.14.3 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/9b/21/2b18339d24a2f73dcefb2f10f48aff6182e16da83e3a612684443c6cfb29/numpy-1.17.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting pandas<0.25,>=0.23.4 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/74/24/0cdbf8907e1e3bc5a8da03345c23cbed7044330bb8f73bb12e711a640a00/pandas-0.24.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting parameterized<0.7.0,>=0.6.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/3a/49/75f6dadb09e2f8ace3cdffe0c99a04f1b98dff41fbf9e768665d8b469e29/parameterized-0.6.3-py2.py3-none-any.whl
Collecting pyhamcrest<2.0,>=1.9 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/9a/d5/d37fd731b7d0e91afcc84577edeccf4638b4f9b82f5ffe2f8b62e2ddc609/PyHamcrest-1.9.0-py2.py3-none-any.whl
Collecting tenacity<6.0,>=5.0.2 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/1e/a1/be8c8610f4620c56790965ba2b564dd76d13cbcd7c2ff8f6053ce63027fb/tenacity-5.1.1-py2.py3-none-any.whl
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.17.0.dev0) (1.12.0)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
Collecting requests>=2.7.0 (from hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting pbr>=0.11 (from mock<3.0.0,>=1.0.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/46/a4/d5c83831a3452713e4b4f126149bc4fbda170f7cb16a86a00ce57ce0e9ad/pbr-5.4.3-py2.py3-none-any.whl
Collecting pyasn1>=0.1.7 (from oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/a1/71/8f0d444e3a74e5640a3d5d967c1c6b015da9c655f35b2d308a55d907a517/pyasn1-0.4.7-py2.py3-none-any.whl
Collecting pyasn1-modules>=0.0.5 (from oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/be/70/e5ea8afd6d08a4b99ebfc77bd1845248d56cfcf43d11f9dc324b9580a35c/pyasn1_modules-0.2.6-py2.py3-none-any.whl
Collecting rsa>=3.1.4 (from oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/02/e5/38518af393f7c214357079ce67a317307936896e961e35450b70fad2a9cf/rsa-4.0-py2.py3-none-any.whl
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.17.0.dev0) (41.2.0)
Collecting pyparsing>=2.1.4 (from pydot<2,>=1.2.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/11/fa/0160cd525c62d7abd076a070ff02b2b94de589f1a9789774f17d7c54058e/pyparsing-2.4.2-py2.py3-none-any.whl
Collecting fasteners>=0.14 (from google-apitools<0.5.29,>=0.5.28->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/18/bd/55eb2d6397b9c0e263af9d091ebdb756b15756029b3cededf6461481bc63/fasteners-0.15-py2.py3-none-any.whl
Collecting google-api-core[grpc]<2.0.0dev,>=1.6.0 (from google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/71/e5/7059475b3013a3c75abe35015c5761735ab224eb1b129fee7c8e376e7805/google_api_core-1.14.2-py2.py3-none-any.whl
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3 (from google-cloud-pubsub<1.1.0,>=0.39.0->apache-beam==2.17.0.dev0)
Collecting google-resumable-media<0.5.0dev,>=0.3.1 (from google-cloud-bigquery<1.18.0,>=1.6.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/96/d7/b29a41b01b854480891dfc408211ffb0cc7a2a3d5f15a3b6740ec18c845b/google_resumable_media-0.4.1-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e0/da/55f51ea951e1b7c63a579c09dd7db825bb730ec1fe9c0180fc77bfb31448/urllib3-1.25.6-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/18/b0/8146a4f8dd402f60744fa380bc73ca47303cccf8b9190fd16a827281eac2/certifi-2019.9.11-py2.py3-none-any.whl
Collecting monotonic>=0.1 (from fasteners>=0.14->google-apitools<0.5.29,>=0.5.28->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ac/aa/063eca6a416f397bd99552c534c6d11d57f58f2e94c14780f3bbf818c4cf/monotonic-1.5-py2.py3-none-any.whl
Collecting google-auth<2.0dev,>=0.4.0 (from google-api-core[grpc]<2.0.0dev,>=1.6.0->google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/c5/9b/ed0516cc1f7609fb0217e3057ff4f0f9f3e3ce79a369c6af4a6c5ca25664/google_auth-1.6.3-py2.py3-none-any.whl
Collecting googleapis-common-protos<2.0dev,>=1.6.0 (from google-api-core[grpc]<2.0.0dev,>=1.6.0->google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.17.0.dev0)
Installing collected packages: crcmod, dill, fastavro, future, docopt, idna, urllib3, chardet, certifi, requests, hdfs, httplib2, pbr, mock, pymongo, pyasn1, pyasn1-modules, rsa, oauth2client, pyparsing, pydot, python-dateutil, pytz, pyyaml, avro-python3, numpy, pyarrow, cachetools, monotonic, fasteners, google-apitools, google-auth, googleapis-common-protos, google-api-core, google-cloud-core, google-cloud-datastore, grpc-google-iam-v1, google-cloud-pubsub, google-resumable-media, google-cloud-bigquery, google-cloud-bigtable, nose, nose-xunitmp, pandas, parameterized, pyhamcrest, tenacity, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam avro-python3-1.9.1 cachetools-3.1.1 certifi-2019.9.11 chardet-3.0.4 crcmod-1.7 dill-0.3.0 docopt-0.6.2 fastavro-0.21.24 fasteners-0.15 future-0.17.1 google-api-core-1.14.2 google-apitools-0.5.28 google-auth-1.6.3 google-cloud-bigquery-1.17.1 google-cloud-bigtable-1.0.0 google-cloud-core-1.0.3 google-cloud-datastore-1.7.4 google-cloud-pubsub-1.0.2 google-resumable-media-0.4.1 googleapis-common-protos-1.6.0 grpc-google-iam-v1-0.12.3 hdfs-2.5.8 httplib2-0.12.0 idna-2.8 mock-2.0.0 monotonic-1.5 nose-1.3.7 nose-xunitmp-0.4.1 numpy-1.17.2 oauth2client-3.0.0 pandas-0.24.2 parameterized-0.6.3 pbr-5.4.3 pyarrow-0.14.1 pyasn1-0.4.7 pyasn1-modules-0.2.6 pydot-1.4.1 pyhamcrest-1.9.0 pymongo-3.9.0 pyparsing-2.4.2 python-dateutil-2.8.0 pytz-2019.2 pyyaml-3.13 requests-2.22.0 rsa-4.0 tenacity-5.1.1 urllib3-1.25.6
:sdks:python:test-suites:dataflow:py35:installGcpTest (Thread[Execution worker for ':',5,main]) completed. Took 17.496 secs.
:sdks:python:test-suites:dataflow:py35:integrationTest (Thread[Execution worker for ':',5,main]) started.

> Task :sdks:python:test-suites:dataflow:py35:integrationTest
Caching disabled for task ':sdks:python:test-suites:dataflow:py35:integrationTest': Caching has not been enabled for the task
Task ':sdks:python:test-suites:dataflow:py35:integrationTest' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Custom actions are attached to task ':sdks:python:test-suites:dataflow:py35:integrationTest'.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/test-suites/dataflow/py35> Command: sh -c . <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/bin/activate> && <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/scripts/run_integration_test.sh> --test_opts "--tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --attr=IT --nocapture" --pipeline_opts "--project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --input=gs://apache-beam-samples/input_small_files/ascii_sort_1MB_input.0000* --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --expect_checksum=ea0ca2e5ee4ea5f218790f28d0b9fe7d09d8d710 --num_workers=10 --autoscaling_algorithm=NONE --runner=TestDataflowRunner --sdk_location=build/apache-beam.tar.gz" --suite integrationTest-perf
Successfully started process 'command 'sh''
>>> RUNNING integration tests with pipeline options: --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --input=gs://apache-beam-samples/input_small_files/ascii_sort_1MB_input.0000* --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --expect_checksum=ea0ca2e5ee4ea5f218790f28d0b9fe7d09d8d710 --num_workers=10 --autoscaling_algorithm=NONE --runner=TestDataflowRunner --sdk_location=build/apache-beam.tar.gz
>>>   test options: --tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --attr=IT --nocapture
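For reference, the run_integration_test.sh call shown above reduces to roughly the following nosetests invocation, run from sdks/python inside the activated virtualenv (a simplified sketch: the xunit/report flags the script normally adds are omitted, the relative paths assume the Jenkins workspace src/ checkout, and all option values are copied from the log rather than re-derived from the script):

    cd sdks/python
    . ../../build/gradleenv/-1734967054/bin/activate
    # Run only the WordCount IT; --test-pipeline-options is passed through to TestPipeline.
    python setup.py nosetests \
      --tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it \
      --attr=IT --nocapture \
      --test-pipeline-options="--project=apache-beam-testing \
        --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it \
        --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it \
        --input=gs://apache-beam-samples/input_small_files/ascii_sort_1MB_input.0000* \
        --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output \
        --expect_checksum=ea0ca2e5ee4ea5f218790f28d0b9fe7d09d8d710 \
        --num_workers=10 --autoscaling_algorithm=NONE \
        --runner=TestDataflowRunner --sdk_location=build/apache-beam.tar.gz"

As the log further down shows, this invocation never completed: PerfKit Benchmarker killed the process when its 1200-second IssueCommand timeout expired.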
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'

STDERR: DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
setup.py:186: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/setuptools/dist.py>:474: UserWarning: Normalizing '2.17.0.dev' to '2.17.0.dev0'
  normalized_version,
beam_fn_api.proto: warning: Import google/protobuf/descriptor.proto but not used.
beam_fn_api.proto: warning: Import google/protobuf/wrappers.proto but not used.
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
[...the preceding line is repeated 12 times in total...]
RefactoringTool: Skipping optional fixer: idioms
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
[...the preceding line is repeated 23 times in total...]
RefactoringTool: Skipping optional fixer: ws_comma
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
[...the preceding line is repeated 21 times in total...]
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
RefactoringTool: Files that were modified:
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md

<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/setuptools/dist.py>:474: UserWarning: Normalizing '2.17.0.dev' to '2.17.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... 
IssueCommand timed out after 1200 seconds.  Process was killed by perfkitbenchmarker.
2019-10-01 07:08:39,460 0b61739e MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-10-01 07:08:39,460 0b61739e MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-10-01 07:08:39,460 0b61739e MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/runs/0b61739e/pkb.log>
2019-10-01 07:08:39,461 0b61739e MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/runs/0b61739e/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_WordCountIT_Py35 #550

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/550/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-8321] fix Flink portable jar test

[valentyn] Restrict dill's upper bound.

[lostluck] Helper to get the value of a KV type


------------------------------------------
[...truncated 179.36 KB...]
Successfully started process 'command 'sh''
Collecting future==0.16.0
Installing collected packages: future
Successfully installed future-0.16.0
Create distribution tar file apache-beam.tar.gz in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/build>
:sdks:python:sdist (Thread[Execution worker for ':',5,main]) completed. Took 6.64 secs.
:sdks:python:test-suites:dataflow:py35:installGcpTest (Thread[Execution worker for ':',5,main]) started.

> Task :sdks:python:test-suites:dataflow:py35:installGcpTest
Caching disabled for task ':sdks:python:test-suites:dataflow:py35:installGcpTest': Caching has not been enabled for the task
Task ':sdks:python:test-suites:dataflow:py35:installGcpTest' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Custom actions are attached to task ':sdks:python:test-suites:dataflow:py35:installGcpTest'.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/test-suites/dataflow/py35> Command: sh -c . <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/bin/activate> && pip install --retries 10 -e <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/[gcp,test]>
Successfully started process 'command 'sh''
Obtaining file://<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python>
Collecting crcmod<2.0,>=1.7 (from apache-beam==2.17.0.dev0)
Collecting dill<0.3.1,>=0.3.0 (from apache-beam==2.17.0.dev0)
Collecting fastavro<0.22,>=0.21.4 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ac/7d/e63a1ba78326e42a69bda88b1fcfca22ddd773c4cc51ae85b3b869abcff2/fastavro-0.21.24-cp35-cp35m-manylinux1_x86_64.whl
Collecting future<1.0.0,>=0.16.0 (from apache-beam==2.17.0.dev0)
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from apache-beam==2.17.0.dev0) (1.24.0)
Collecting hdfs<3.0.0,>=2.1.0 (from apache-beam==2.17.0.dev0)
Collecting httplib2<=0.12.0,>=0.8 (from apache-beam==2.17.0.dev0)
Collecting mock<3.0.0,>=1.0.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e6/35/f187bdf23be87092bd0f1200d43d23076cee4d0dec109f195173fd3ebc79/mock-2.0.0-py2.py3-none-any.whl
Collecting pymongo<4.0.0,>=3.8.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/fe/96/3f43c48b2801e5cefe893421d67640cdc2b7cd940a51790b5c2062fb044e/pymongo-3.9.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting oauth2client<4,>=2.0.1 (from apache-beam==2.17.0.dev0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from apache-beam==2.17.0.dev0) (3.9.2)
Collecting pydot<2,>=1.2.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/33/d1/b1479a770f66d962f545c2101630ce1d5592d90cb4f083d38862e93d16d2/pydot-1.4.1-py2.py3-none-any.whl
Collecting python-dateutil<3,>=2.8.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/41/17/c62faccbfbd163c7f57f3844689e3a78bae1f403648a6afb1d0866d87fbb/python_dateutil-2.8.0-py2.py3-none-any.whl
Collecting pytz>=2018.3 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/87/76/46d697698a143e05f77bec5a526bf4e56a0be61d63425b68f4ba553b51f2/pytz-2019.2-py2.py3-none-any.whl
Collecting pyyaml<4.0.0,>=3.12 (from apache-beam==2.17.0.dev0)
Collecting pyarrow<0.15.0,>=0.11.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/54/95/bcbe5658d6ac65af35996a80ed66d82c50f9c0b36424f4758cd54dd08d73/pyarrow-0.14.1-cp35-cp35m-manylinux2010_x86_64.whl
Collecting avro-python3<2.0.0,>=1.8.1 (from apache-beam==2.17.0.dev0)
Collecting cachetools<4,>=3.1.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/2f/a6/30b0a0bef12283e83e58c1d6e7b5aabc7acfc4110df81a4471655d33e704/cachetools-3.1.1-py2.py3-none-any.whl
Collecting google-apitools<0.5.29,>=0.5.28 (from apache-beam==2.17.0.dev0)
Collecting google-cloud-datastore<1.8.0,>=1.7.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/d0/aa/29cbcf8cf7d08ce2d55b9dce858f7c632b434cb6451bed17cb4275804217/google_cloud_datastore-1.7.4-py2.py3-none-any.whl
Collecting google-cloud-pubsub<1.1.0,>=0.39.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/d3/91/07a82945a7396ea34debafd476724bb5fc267c292790fdf2138c693f95c5/google_cloud_pubsub-1.0.2-py2.py3-none-any.whl
Collecting google-cloud-bigquery<1.18.0,>=1.6.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/a4/96/1b9cf1d43869c47a205aad411dac7c3040df6093d63c39273fa4d4c45da7/google_cloud_bigquery-1.17.1-py2.py3-none-any.whl
Collecting google-cloud-core<2,>=0.28.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ee/f0/084f598629db8e6ec3627688723875cdb03637acb6d86999bb105a71df64/google_cloud_core-1.0.3-py2.py3-none-any.whl
Collecting google-cloud-bigtable<1.1.0,>=0.31.1 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/95/af/0ef7d097a1d5ad0c843867600e86de915e8ab8864740f49a4636cfb51af6/google_cloud_bigtable-1.0.0-py2.py3-none-any.whl
Collecting nose>=1.3.7 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/15/d8/dd071918c040f50fa1cf80da16423af51ff8ce4a0f2399b7bf8de45ac3d9/nose-1.3.7-py3-none-any.whl
Collecting nose_xunitmp>=0.4.1 (from apache-beam==2.17.0.dev0)
Collecting numpy<2,>=1.14.3 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/9b/21/2b18339d24a2f73dcefb2f10f48aff6182e16da83e3a612684443c6cfb29/numpy-1.17.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting pandas<0.25,>=0.23.4 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/74/24/0cdbf8907e1e3bc5a8da03345c23cbed7044330bb8f73bb12e711a640a00/pandas-0.24.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting parameterized<0.7.0,>=0.6.0 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/3a/49/75f6dadb09e2f8ace3cdffe0c99a04f1b98dff41fbf9e768665d8b469e29/parameterized-0.6.3-py2.py3-none-any.whl
Collecting pyhamcrest<2.0,>=1.9 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/9a/d5/d37fd731b7d0e91afcc84577edeccf4638b4f9b82f5ffe2f8b62e2ddc609/PyHamcrest-1.9.0-py2.py3-none-any.whl
Collecting tenacity<6.0,>=5.0.2 (from apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/1e/a1/be8c8610f4620c56790965ba2b564dd76d13cbcd7c2ff8f6053ce63027fb/tenacity-5.1.1-py2.py3-none-any.whl
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.17.0.dev0) (1.12.0)
Collecting requests>=2.7.0 (from hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
Collecting pbr>=0.11 (from mock<3.0.0,>=1.0.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/46/a4/d5c83831a3452713e4b4f126149bc4fbda170f7cb16a86a00ce57ce0e9ad/pbr-5.4.3-py2.py3-none-any.whl
Collecting pyasn1>=0.1.7 (from oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/a1/71/8f0d444e3a74e5640a3d5d967c1c6b015da9c655f35b2d308a55d907a517/pyasn1-0.4.7-py2.py3-none-any.whl
Collecting rsa>=3.1.4 (from oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/02/e5/38518af393f7c214357079ce67a317307936896e961e35450b70fad2a9cf/rsa-4.0-py2.py3-none-any.whl
Collecting pyasn1-modules>=0.0.5 (from oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/be/70/e5ea8afd6d08a4b99ebfc77bd1845248d56cfcf43d11f9dc324b9580a35c/pyasn1_modules-0.2.6-py2.py3-none-any.whl
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.17.0.dev0) (41.2.0)
Collecting pyparsing>=2.1.4 (from pydot<2,>=1.2.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/11/fa/0160cd525c62d7abd076a070ff02b2b94de589f1a9789774f17d7c54058e/pyparsing-2.4.2-py2.py3-none-any.whl
Collecting fasteners>=0.14 (from google-apitools<0.5.29,>=0.5.28->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/18/bd/55eb2d6397b9c0e263af9d091ebdb756b15756029b3cededf6461481bc63/fasteners-0.15-py2.py3-none-any.whl
Collecting google-api-core[grpc]<2.0.0dev,>=1.6.0 (from google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/71/e5/7059475b3013a3c75abe35015c5761735ab224eb1b129fee7c8e376e7805/google_api_core-1.14.2-py2.py3-none-any.whl
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3 (from google-cloud-pubsub<1.1.0,>=0.39.0->apache-beam==2.17.0.dev0)
Collecting google-resumable-media<0.5.0dev,>=0.3.1 (from google-cloud-bigquery<1.18.0,>=1.6.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/96/d7/b29a41b01b854480891dfc408211ffb0cc7a2a3d5f15a3b6740ec18c845b/google_resumable_media-0.4.1-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e0/da/55f51ea951e1b7c63a579c09dd7db825bb730ec1fe9c0180fc77bfb31448/urllib3-1.25.6-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/18/b0/8146a4f8dd402f60744fa380bc73ca47303cccf8b9190fd16a827281eac2/certifi-2019.9.11-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting monotonic>=0.1 (from fasteners>=0.14->google-apitools<0.5.29,>=0.5.28->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ac/aa/063eca6a416f397bd99552c534c6d11d57f58f2e94c14780f3bbf818c4cf/monotonic-1.5-py2.py3-none-any.whl
Collecting google-auth<2.0dev,>=0.4.0 (from google-api-core[grpc]<2.0.0dev,>=1.6.0->google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.17.0.dev0)
  Using cached https://files.pythonhosted.org/packages/c5/9b/ed0516cc1f7609fb0217e3057ff4f0f9f3e3ce79a369c6af4a6c5ca25664/google_auth-1.6.3-py2.py3-none-any.whl
Collecting googleapis-common-protos<2.0dev,>=1.6.0 (from google-api-core[grpc]<2.0.0dev,>=1.6.0->google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.17.0.dev0)
Installing collected packages: crcmod, dill, fastavro, future, chardet, urllib3, certifi, idna, requests, docopt, hdfs, httplib2, pbr, mock, pymongo, pyasn1, rsa, pyasn1-modules, oauth2client, pyparsing, pydot, python-dateutil, pytz, pyyaml, numpy, pyarrow, avro-python3, cachetools, monotonic, fasteners, google-apitools, google-auth, googleapis-common-protos, google-api-core, google-cloud-core, google-cloud-datastore, grpc-google-iam-v1, google-cloud-pubsub, google-resumable-media, google-cloud-bigquery, google-cloud-bigtable, nose, nose-xunitmp, pandas, parameterized, pyhamcrest, tenacity, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam avro-python3-1.9.1 cachetools-3.1.1 certifi-2019.9.11 chardet-3.0.4 crcmod-1.7 dill-0.3.0 docopt-0.6.2 fastavro-0.21.24 fasteners-0.15 future-0.17.1 google-api-core-1.14.2 google-apitools-0.5.28 google-auth-1.6.3 google-cloud-bigquery-1.17.1 google-cloud-bigtable-1.0.0 google-cloud-core-1.0.3 google-cloud-datastore-1.7.4 google-cloud-pubsub-1.0.2 google-resumable-media-0.4.1 googleapis-common-protos-1.6.0 grpc-google-iam-v1-0.12.3 hdfs-2.5.8 httplib2-0.12.0 idna-2.8 mock-2.0.0 monotonic-1.5 nose-1.3.7 nose-xunitmp-0.4.1 numpy-1.17.2 oauth2client-3.0.0 pandas-0.24.2 parameterized-0.6.3 pbr-5.4.3 pyarrow-0.14.1 pyasn1-0.4.7 pyasn1-modules-0.2.6 pydot-1.4.1 pyhamcrest-1.9.0 pymongo-3.9.0 pyparsing-2.4.2 python-dateutil-2.8.0 pytz-2019.2 pyyaml-3.13 requests-2.22.0 rsa-4.0 tenacity-5.1.1 urllib3-1.25.6
:sdks:python:test-suites:dataflow:py35:installGcpTest (Thread[Execution worker for ':',5,main]) completed. Took 20.299 secs.
:sdks:python:test-suites:dataflow:py35:integrationTest (Thread[Execution worker for ':',5,main]) started.

> Task :sdks:python:test-suites:dataflow:py35:integrationTest
Caching disabled for task ':sdks:python:test-suites:dataflow:py35:integrationTest': Caching has not been enabled for the task
Task ':sdks:python:test-suites:dataflow:py35:integrationTest' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Custom actions are attached to task ':sdks:python:test-suites:dataflow:py35:integrationTest'.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/test-suites/dataflow/py35> Command: sh -c . <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/bin/activate> && <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/scripts/run_integration_test.sh> --test_opts "--tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --attr=IT --nocapture" --pipeline_opts "--project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --input=gs://apache-beam-samples/input_small_files/ascii_sort_1MB_input.0000* --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --expect_checksum=ea0ca2e5ee4ea5f218790f28d0b9fe7d09d8d710 --num_workers=10 --autoscaling_algorithm=NONE --runner=TestDataflowRunner --sdk_location=build/apache-beam.tar.gz" --suite integrationTest-perf
Successfully started process 'command 'sh''
>>> RUNNING integration tests with pipeline options: --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --input=gs://apache-beam-samples/input_small_files/ascii_sort_1MB_input.0000* --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --expect_checksum=ea0ca2e5ee4ea5f218790f28d0b9fe7d09d8d710 --num_workers=10 --autoscaling_algorithm=NONE --runner=TestDataflowRunner --sdk_location=build/apache-beam.tar.gz
>>>   test options: --tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --attr=IT --nocapture
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'

STDERR: DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
setup.py:186: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/setuptools/dist.py>:474: UserWarning: Normalizing '2.17.0.dev' to '2.17.0.dev0'
  normalized_version,
beam_fn_api.proto: warning: Import google/protobuf/descriptor.proto but not used.
beam_fn_api.proto: warning: Import google/protobuf/wrappers.proto but not used.
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
[...the preceding line is repeated 12 times in total...]
RefactoringTool: Skipping optional fixer: idioms
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
[...the preceding line is repeated 23 times in total...]
RefactoringTool: Skipping optional fixer: ws_comma
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
[...the preceding line is repeated 21 times in total...]
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
RefactoringTool: Files that were modified:
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md

<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/setuptools/dist.py>:474: UserWarning: Normalizing '2.17.0.dev' to '2.17.0.dev0'
  normalized_version,
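The UserWarning above is just setuptools applying PEP 440 normalization to the SDK's development version string. The same rewrite can be reproduced outside the build with the third-party packaging library (assumed installed here purely for illustration):

    from packaging.version import Version

    # PEP 440 treats a bare ".dev" suffix as an implicit ".dev0", which is
    # exactly the normalization the warning reports for '2.17.0.dev'.
    print(Version('2.17.0.dev'))  # prints: 2.17.0.dev0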
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... 
IssueCommand timed out after 1200 seconds.  Process was killed by perfkitbenchmarker.
2019-10-01 01:08:39,547 1be5f5e0 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-10-01 01:08:39,547 1be5f5e0 MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-10-01 01:08:39,547 1be5f5e0 MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/runs/1be5f5e0/pkb.log>
2019-10-01 01:08:39,548 1be5f5e0 MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/runs/1be5f5e0/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_WordCountIT_Py35 #549

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/549/display/redirect?page=changes>

Changes:

[ihr] Add clarification about authorized views

[kirillkozlov] [BEAM-8275] Beam SQL should support BigQuery in DIRECT_READ mode

[github] Addressed review comments

[github] Added a test for BigQuery SQL read in EXPORT mode

[lukecwik] [BEAM-6923] limit gcs buffer size to 1MB for artifact upload (#9647)


------------------------------------------
[...truncated 156.97 KB...]
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "write/Write/WriteImpl/FinalizeWrite.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s7"
        },
        "serialized_fn": "eNrNV/l/G8UVX0l2EpajISkhIT3UFLdriqQYYkpcoKVKnLhqFHdt8EKbLqPdkWbj3Z19M7NWTCMKSeWY3nehd+l9l9738Xf0n+mbkWQjsGl+g8/H3tEc782b7/u+7+w+XXICkpGAUb9FSVJVgqSyzUUiqwEX1K6TOCatmK4IkmVUnOLzqQ3W9DNQ6EHR8W6yLMtv57hGdGRGAyg1vZIZS2FizHHE8a9FJLX9dpSSOHqS+l0RKWrDpLcXTTLBAyol7GGT3o3ahVrPqM+iVEnYOx4jTpjxakgxSKK4kPbC+WUcPquHbdiHAd7Q7IE9iJDnKsuVcSjhxqZxH6XbQzc18w24uZW3LsAtzsujVvSSirjtL2O7FKWrNrwBve/vwa2Otx8dtaOY+hlRzM8EbUeX4MCYB56hdSqrayTOcZ3ga1FIhb2kiIqCR/Xg4mgMDqLjN/bgNsez0fHAQkcIh4IwiuOqr5+2H3MSmnEbbjdoSyXgcB+OuHCHNzmyhKPeefzdkXO1mqJJVpGIE+nQCma3QtOwovigoVLJWrZeiVQliHke1gZw1WZm7zt5P/6dODFzcrYmqMxjTMWb8pZ3RKdH8yHFOASJ4ijt+CntYotovnnT4BLwBAGREk8/OMRbvAkcJrni8FbvBvyZRMnwfGVzYI10LYtJlMLbvEP6XIyI0E8JLuvEvOVrXhIFx8xk5a4Kb1emnDRPfLNQTh+fDeHtxtX2INzZsAwoAUeMYWosOWZMDht7mS8pgUepc5ONd2A23tkDp2Xo6TNKtIPppnfreGjDqO7y7tRRTTnDqTzR8ewY47u2iWM8yLytiXM30wSo9KDqsEPsDnbUw1UW1HJk5jgrU0UF1lA1V1FsPyw6eUJTtRiTgDIem+CPo6cZpp/3NAqmvZf24MQFmHW8aR0/Mu54xRRgbWX7uZBkcW1+WJ9mBO4b2zoz3LL9R9IsClZjGi6howVdSza8uwf3OybJIVEETu5kuLX8FC6xYQ4De08PHnAMxmsR7WrleHCM7oGgRCHMeRroUrLhIYfd7u3D9TpRuurhvX14nwsPO41Cw8L/UuNg/eZNy7psWZsF62rRWoL3N/tQnzZWeCihJQ1O9eG0J3CkxnhCaxdpuhqlctRWZEzWaK3LxarEc9CaPoa/SIXJdxrQZV04/goXYZ3nqVpY9hfX752tSRHUZLiqS0oxntZeAkJtAEI1W4d5E8oDMUlaIXkIzpw7X6hbcNa7TRND8MQX6FLXx1a0C0bKDERDTYEPbEBjWsEHXTg3hliHKp8ohTRomm1aeRQrPBKcNzDjtJ6FxQ34kAvumGmUZFwoP+FhHqO+LHkHdM2+Io+w3IdHXHjUuPfRNlC+Dysb4LnwGDvT3CmBAcUOPM4wfZipImaq1NjbqNevKCz9gnWxqPMVmnxdLVg97BYtOWNdxqmSFRYtNWFdtUw21aSew0VhyVotWqKhu+GEdWRJ7bHCSWtrXu0dzWx3SqbTLlmHsLmCPiykx4edZqNowAlpm6DQwUc0l7yncGSxzuOYGu6VebsssTLKU2G5GylWTvCOLCtGcCalZRpTXYhlEuibjIZlIssEDdJOTBVa6+RVy/ORkKqsuny0XpZpGmgKUaFt0OOxKXnsbvOswgUFHx1UVRxJBb6RTs0KxXks4Qlvj+7LOAooEHMdYH6h5d2Cv04nmVrfKjkIzHRMUwiNSprr57QQXABlhxW0vaLxDR0DxYhmzOyu9Q+ia3CRzRv99Xem6eq5Fwv1/VbxcGFP4WDhQGFfoVQoFSGeRqImLqTsMbbaxKuZNxVkLkAfhAuSPd4DZYS1G6Uh7/oJXi/6VkE1yHd7QdEaZq5xaZv7gcT+K6xtWJN96LpwaQPWe/Ak7voxFy4bdAZp0ne3llPoeUd1reAmc9qzb1zPjQ42tzYDT+0S4cfZWt5i+nkNnt6S15nrktdnGMrmFYedZFoLr/bgEw57cFgjAzUrNJbqE5c1S/soYhvTDHXr2mulW5vjuvXsuf8W2Fmd20+68CnM7bM6t59GlD/jwmfZ/4f0c+yl4H1+C7x7rgu8L2jwvjgC70s9+PLLwdvhKvgKovhVg+LXXisUnxtH8XlUf3aWLTAU9K8jlt9w4ZuI5fNN9vpUy29ptWSvG4X8toLvOMxnTzDCWixgITNq9l3WYYyhYn2PPfdqivXCzor1fc3qH7jwQ8zEC5rVP0JW/9iFn/Thpy78TCvWz3fRg18wLTq/dOFXG/DrHvwGDV904bfXUQ6/GyuH39NX/TZbMRvjnjb8Aen/xx78yRl8OkXSH91jf940Fwa+2HY6yN4U/rKbz+ES+9TAcnnYhb+i77+Zk2LW8iSPiU60fkWg8PdGwVB5gAF6/8du3gcr7DNGpweR44faP9H3v8w7hs6JVCTJfPxwaOGHhIB/Nwp5S8F/qv8D4kXWWQ==",
        "user_name": "write/Write/WriteImpl/FinalizeWrite/FinalizeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-09-30T18:57:28.994497Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-09-30_11_57_27-3562915867557812019'
 location: 'us-central1'
 name: 'beamapp-jenkins-0930185725-352803'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-09-30T18:57:28.994497Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-09-30_11_57_27-3562915867557812019]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_57_27-3562915867557812019?project=apache-beam-testing
root: INFO: Job 2019-09-30_11_57_27-3562915867557812019 is in state JOB_STATE_RUNNING
root: INFO: 2019-09-30T18:57:32.076Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-09-30T18:57:32.548Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
root: INFO: 2019-09-30T18:57:33.157Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-09-30T18:57:33.193Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-09-30T18:57:33.223Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
root: INFO: 2019-09-30T18:57:33.265Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-09-30T18:57:33.292Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-09-30T18:57:33.390Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-09-30T18:57:33.431Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-09-30T18:57:33.469Z: JOB_MESSAGE_DETAILED: Fusing consumer split into read/Read
root: INFO: 2019-09-30T18:57:33.504Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
root: INFO: 2019-09-30T18:57:33.539Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Reify into pair_with_one
root: INFO: 2019-09-30T18:57:33.573Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Write into group/Reify
root: INFO: 2019-09-30T18:57:33.606Z: JOB_MESSAGE_DETAILED: Fusing consumer group/GroupByWindow into group/Read
root: INFO: 2019-09-30T18:57:33.639Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/GroupByWindow
root: INFO: 2019-09-30T18:57:33.675Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
root: INFO: 2019-09-30T18:57:33.710Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WriteBundles/WriteBundles into format
root: INFO: 2019-09-30T18:57:33.746Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles/WriteBundles
root: INFO: 2019-09-30T18:57:33.775Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WindowInto(WindowIntoFn) into write/Write/WriteImpl/Pair
root: INFO: 2019-09-30T18:57:33.804Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Reify into write/Write/WriteImpl/WindowInto(WindowIntoFn)
root: INFO: 2019-09-30T18:57:33.838Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Write into write/Write/WriteImpl/GroupByKey/Reify
root: INFO: 2019-09-30T18:57:33.860Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/GroupByWindow into write/Write/WriteImpl/GroupByKey/Read
root: INFO: 2019-09-30T18:57:33.892Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Extract into write/Write/WriteImpl/GroupByKey/GroupByWindow
root: INFO: 2019-09-30T18:57:33.918Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/InitializeWrite into write/Write/WriteImpl/DoOnce/Read
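For orientation, the step names in the fusion messages above (read, split, pair_with_one, group, count, format, write) are the labels from the Python WordCount example pipeline that this test submits. A rough sketch of that pipeline's shape, with placeholder input and output paths rather than the ones used by this run, looks like:

    import re
    import apache_beam as beam

    # Hedged reconstruction of apache_beam/examples/wordcount.py, shown only to
    # map the Dataflow step names above back to pipeline stages.
    with beam.Pipeline() as p:
        (p
         | 'read' >> beam.io.ReadFromText('gs://your-bucket/input.txt')
         | 'split' >> beam.FlatMap(lambda line: re.findall(r"[\w']+", line))
         | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
         | 'group' >> beam.GroupByKey()
         | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
         | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
         | 'write' >> beam.io.WriteToText('gs://your-bucket/results'))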
root: INFO: 2019-09-30T18:57:33.943Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-09-30T18:57:33.964Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-09-30T18:57:33.991Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-09-30T18:57:34.022Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-09-30T18:57:34.165Z: JOB_MESSAGE_DEBUG: Executing wait step start26
root: INFO: 2019-09-30T18:57:34.236Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
root: INFO: 2019-09-30T18:57:34.270Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Create
root: INFO: 2019-09-30T18:57:34.282Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-09-30T18:57:34.306Z: JOB_MESSAGE_BASIC: Executing operation group/Create
root: INFO: 2019-09-30T18:57:34.318Z: JOB_MESSAGE_BASIC: Starting 10 workers in us-central1-f...
root: INFO: 2019-09-30T18:57:34.376Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/GroupByKey/Create
root: INFO: 2019-09-30T18:57:34.376Z: JOB_MESSAGE_BASIC: Finished operation group/Create
root: INFO: 2019-09-30T18:57:34.457Z: JOB_MESSAGE_DEBUG: Value "group/Session" materialized.
root: INFO: 2019-09-30T18:57:34.493Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/GroupByKey/Session" materialized.
root: INFO: 2019-09-30T18:57:34.529Z: JOB_MESSAGE_BASIC: Executing operation read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-09-30T18:57:59.605Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 10 based on the rate of progress in the currently running step(s).
root: INFO: 2019-09-30T18:58:31.405Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-09-30T18:58:31.448Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-09-30T19:02:22.824Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given
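The traceback above shows where the worker fails: Beam's pickler.load_session simply hands the staged main-session file to dill.load_session, and a TypeError inside dill's _create_function while unpickling is the usual symptom of dill version skew, i.e. the session was pickled by a newer dill than the one installed in the worker container, so the pickle carries more arguments than the older function accepts. A minimal sketch of the dump/load pair involved, handy for checking which dill version each side is running, could look like this (path and prints are illustrative only):

    import dill

    # Sketch only -- not the Beam worker code. The submitting process writes the
    # interpreter session with dump_session; the worker restores it with
    # load_session. If the two sides run different dill versions, load_session
    # can fail with the _create_function TypeError reported above.
    print('dill version in this environment:', dill.__version__)

    SESSION_FILE = '/tmp/main_session.pkl'  # hypothetical staging path
    dill.dump_session(SESSION_FILE)
    dill.load_session(SESSION_FILE)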

root: INFO: 2019-09-30T19:02:24.881Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T19:02:27.050Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T19:02:29.109Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T19:02:30.184Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T19:02:31.174Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T19:02:32.234Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T19:02:32.580Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
root: INFO: 2019-09-30T19:02:32.648Z: JOB_MESSAGE_DEBUG: Executing failure step failure25
root: INFO: 2019-09-30T19:02:32.682Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S01:write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite failed., Internal Issue (3682b21ebd00c6d9): 63963027:24514
root: INFO: 2019-09-30T19:02:32.750Z: JOB_MESSAGE_BASIC: Finished operation read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-09-30T19:02:32.860Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-09-30T19:02:32.923Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-09-30T19:02:32.951Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-09-30T19:04:48.463Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-09-30T19:04:48.507Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-09-30T19:04:48.540Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-09-30_11_57_27-3562915867557812019 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1569869844195/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1569869844195/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1569869844195\\/results[^/\\\\]*'
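The translate_pattern DEBUG line above shows the glob-to-regex translation used when matching output shards: literal characters are backslash-escaped and '*' becomes "any run of characters that is not a path separator". A hypothetical re-implementation of just that mapping (the real logic lives in apache_beam.io.filesystem) is:

    def translate_pattern(pattern):
        # Illustrative only: keep alphanumerics, escape everything else, and map
        # '*' to a character class that stops at path separators, matching the
        # input/output pair shown in the DEBUG line above.
        parts = []
        for ch in pattern:
            if ch == '*':
                parts.append(r'[^/\\]*')
            elif ch.isalnum():
                parts.append(ch)
            else:
                parts.append('\\' + ch)
        return ''.join(parts)

    print(translate_pattern('gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1569869844195/results*'))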
root: INFO: Starting the size estimation of the input
root: INFO: Finished listing 0 files in 0.04271101951599121 seconds.
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-integrationTest-perf.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 455.586s

FAILED (errors=1)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:integrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 8m 23s

2019-09-30 19:05:01,790 d88248ee MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 841, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 687, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 91, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
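The PerfKit Benchmarker failure above is mechanical rather than a second root cause: SubmitJob in gcp_dpb_dataflow.py runs the integration-test command and asserts on its exit code, so the non-zero exit from the Gradle task surfaces here as an AssertionError. A stripped-down sketch of that pattern (command and function name are illustrative, not PKB's actual code) is:

    import subprocess

    def submit_job(cmd):
        # Illustrative only: run the integration test and turn a non-zero exit
        # code into an AssertionError for the benchmark harness to record.
        retcode = subprocess.call(cmd)
        assert retcode == 0, 'Integration Test Failed.'

    submit_job(['./gradlew', ':sdks:python:test-suites:dataflow:py35:integrationTest'])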
2019-09-30 19:05:01,791 d88248ee MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-09-30 19:05:01,793 d88248ee MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 841, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 687, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 91, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-09-30 19:05:01,793 d88248ee MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-09-30 19:05:01,794 d88248ee MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-09-30 19:05:01,794 d88248ee MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/runs/d88248ee/pkb.log>
2019-09-30 19:05:01,794 d88248ee MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/runs/d88248ee/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_WordCountIT_Py35 #548

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/548/display/redirect>

Changes:


------------------------------------------
[...truncated 156.75 KB...]
            "output_name": "out",
            "step_name": "SideInput-s19"
          },
          "side2-write/Write/WriteImpl/FinalizeWrite": {
            "@type": "OutputReference",
            "output_name": "out",
            "step_name": "SideInput-s20"
          }
        },
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "write/Write/WriteImpl/FinalizeWrite.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s7"
        },
        "serialized_fn": "eNrNV/l/3MQV1+46ByLQkJQ0gR7blLQyZVcxYHAMpU03CZiNN65ssHoEdVaa3VE80uhpRnZMsxSSruPS++DsXXrf9L6Pv6P/TN/Mem0WbJrf4POxpdWbeW/efN/3fUd6suKEJCMho0GbkqSucpLKjsgTWQ9FTu0G4Zy0OV3MSZbR/JQ4k9pgjT8FpR6UHf96y7ICtZrRgMWpklAZDYYDxl6PKEYjSuTSnjm3gOaHtNmGMYy0q9WD3Y6/D0OJQmWFMgEl7GmZ8HG6ZdrbKtbguraZG3QKTC3vyoyGYLf8PWjLchFSKeH6kTRigX9tIqkddOKU8PhxGqzksaI27PMrJlQKN7B9Rfs83Oi82lXRiyoWdrCA9/k4XbLhLZj0/h7c5Pj70bkTcxqkJKGBLDqd+CIcGIkgMvROZX2Z8IIGmOByHNHcnldExeGj2jg3tMFBDPzWHtzs+LswsPGAQz7+tOBtvj00GSzgcBjFnNcDfbUDLkhk7DYcMVuSKodb+nCrB28v2ibPUCRZjuBgNoMI7/DH0EwKJeCdZsFQYBbwrpH0jU1u3OwFMa/yOO02hMm3ivm+uwdH2/516J7EyUZu7zHJauDcjJM4hduMIS2SQDKSRxKONS3/iEZesyrF1HMScwwcpHQF71jq9677N+l96PkDeDUriYL3+behvXbM2RgqkvHjk1FNdNC0tYK2geMfGg3R5aI9jDNuBmu3b+t5+1ZpM6IY1o3q0r6f6RLd0YOaww7553BKV067rqJJVpPIbtKlNQxfwy3VlBjcqFTSzVZrsaqFXBSROyC5OzF5z4mpu6emjk/eOzXlYmUKjg1UZ4eZrphmc8Ao0RVxW0V7lJWpojkSuV6omNsn826R0FTNcRJSJrgpzXHMc4Lp653NkrnfRXtw93mYdPxxDQoy7njNdIG7uHWdSTLuntloEmOBe0aWzgwF7eCRNIvDJU6jeQw0o1vUhnt7MOWYqq3EaSRWggSrq4uK7XViJ5XRiZgWl7YpD+HBa7xtmDZcjYgicN92+WxmcQqn2HA/7vcDPXjA8W/UFA+1KOhCatjgg/6taNTe03rxwKw+jVvNtdBNL0/Ahwz8yzFd0amfHOm0MKdEIRmLNNSNbcOHHXbE34vzdVNoaYNGH055cNpplpoW/leaBxs3rFvWJctaL1lXytY8nGn14cFx4zVcFx7qw4yfo8VlIqHuBZouxakc3muSk2Xqroh8SeL2qavzD+ZobtichnRB8yxYFHnUEEWqZhaCudW7Jl2Zh66MljQDFROp+wrs3AF29WwVHjap3M9J0o7IA9CcPVdqWHDWv1k3QS6SIMeQur03s501Gmwg2lA4aK3BuXEFcx58ZASxLlUBQo+k9Mwy7SLmCrcE8wZmHNajsLAGj3jw6IhrnGQiV0EiooKjtC36B3Q5X1N+8PvwUQ8+ZsIH6BuqIICPr8EnPDjPmq3tChhSfIDHGJYPK1XGSlWae5qNxmWFylWyLpR1vSJTryslq4ePZUtOWJdwqGJFZUuNWVcsU021S4/hpKhiLZWtvKkfozHryLzabUW7rM1xtWc4svVQMQ+dinUIb5cxhoX0CJxWs2zAiWiHoC7AJzWX/CfQMtcQnFPDvaroVCX2afVYVF2JFasmeGJXFSM4ktIq5VTLQnVAfxpViawSdEi7nCr01sWrV8/EuVRVtSKG82WVpqGmEM21D0Y8ekwevcNc64DC2XZMM/JYKgiN8mtWKCG4hMjfrZ8lj0MK1JxEWF/omD48nWRqdbNToWuGOU2BmePBHIan81zkELNbFFzwyyY2LBkohjTjZnUtjZBchZQ9bHQ82J6mYvblUmO/VT5c2l06WDpQ2luqlCplyMaRqOBBzs4z0cL3D9lSoDwo+rDswQp7rAcXdxCyVTYt+/C4B59ag0s96KHjEx58mp1g00W7uApPbsrrxDXJ61MMZfOyo/3ZfUwL15UefMZh/1+l+uzkBnsHOlNqzjfGLmn+rKG8XB1nqCjrb5SifHZUUZ6e/W+JndWof86DzyPqT2vUv4DgfdGDL22C9+VN8O68JvC+osH76ivB+1oPvn4t4D3zKvC2EelnEcXnDIrPv1EovjCK4ouoy+wsm2Uotd9ALL/pwbcQyxdb7M2pY9/WOsbeNNr1HQXfdVjIIkZZh3UZY0ZnvseWGGeoJd9nL7yelry0vZb8QLP6hx78CCvxkmb1j5HVP/Hgp334mQc/11ryix205JdGS37lwa/X4Dc9eBkdf+vB7zbb4ff0db/NFk1AjGXDH5D6f+zBnxyjx/iO3u0iBVP4804BNqbYpwYHzMLGI/wFA/3VsG6QLsb4204xBjPsB81b2yAZ/KT7O0b4h9kwFq9ICk50vfUZTuGfzZI5wjWwUpEkC/CrpI3v+zn8C4fM12Qsg+Gp9+/1oq3gP/X/AQ1X04k=",
        "user_name": "write/Write/WriteImpl/FinalizeWrite/FinalizeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-09-30T13:06:50.404170Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-09-30_06_06_49-4361594281457277123'
 location: 'us-central1'
 name: 'beamapp-jenkins-0930130646-850719'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-09-30T13:06:50.404170Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-09-30_06_06_49-4361594281457277123]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_06_06_49-4361594281457277123?project=apache-beam-testing
root: INFO: Job 2019-09-30_06_06_49-4361594281457277123 is in state JOB_STATE_RUNNING
root: INFO: 2019-09-30T13:06:52.270Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-09-30T13:06:52.784Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-09-30T13:06:53.365Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-09-30T13:06:53.404Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-09-30T13:06:53.435Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
root: INFO: 2019-09-30T13:06:53.471Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-09-30T13:06:53.505Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-09-30T13:06:53.606Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-09-30T13:06:53.672Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-09-30T13:06:53.704Z: JOB_MESSAGE_DETAILED: Fusing consumer split into read/Read
root: INFO: 2019-09-30T13:06:53.738Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
root: INFO: 2019-09-30T13:06:53.775Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Reify into pair_with_one
root: INFO: 2019-09-30T13:06:53.812Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Write into group/Reify
root: INFO: 2019-09-30T13:06:53.853Z: JOB_MESSAGE_DETAILED: Fusing consumer group/GroupByWindow into group/Read
root: INFO: 2019-09-30T13:06:53.889Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/GroupByWindow
root: INFO: 2019-09-30T13:06:53.917Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
root: INFO: 2019-09-30T13:06:53.955Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WriteBundles/WriteBundles into format
root: INFO: 2019-09-30T13:06:53.989Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles/WriteBundles
root: INFO: 2019-09-30T13:06:54.021Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WindowInto(WindowIntoFn) into write/Write/WriteImpl/Pair
root: INFO: 2019-09-30T13:06:54.058Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Reify into write/Write/WriteImpl/WindowInto(WindowIntoFn)
root: INFO: 2019-09-30T13:06:54.091Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Write into write/Write/WriteImpl/GroupByKey/Reify
root: INFO: 2019-09-30T13:06:54.128Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/GroupByWindow into write/Write/WriteImpl/GroupByKey/Read
root: INFO: 2019-09-30T13:06:54.156Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Extract into write/Write/WriteImpl/GroupByKey/GroupByWindow
root: INFO: 2019-09-30T13:06:54.192Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/InitializeWrite into write/Write/WriteImpl/DoOnce/Read
root: INFO: 2019-09-30T13:06:54.219Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-09-30T13:06:54.254Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-09-30T13:06:54.280Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-09-30T13:06:54.318Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-09-30T13:06:54.472Z: JOB_MESSAGE_DEBUG: Executing wait step start26
root: INFO: 2019-09-30T13:06:54.546Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
root: INFO: 2019-09-30T13:06:54.581Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Create
root: INFO: 2019-09-30T13:06:54.595Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-09-30T13:06:54.616Z: JOB_MESSAGE_BASIC: Starting 10 workers in us-central1-a...
root: INFO: 2019-09-30T13:06:54.617Z: JOB_MESSAGE_BASIC: Executing operation group/Create
root: INFO: 2019-09-30T13:06:54.665Z: JOB_MESSAGE_BASIC: Finished operation group/Create
root: INFO: 2019-09-30T13:06:54.665Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/GroupByKey/Create
root: INFO: 2019-09-30T13:06:54.739Z: JOB_MESSAGE_DEBUG: Value "group/Session" materialized.
root: INFO: 2019-09-30T13:06:54.847Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/GroupByKey/Session" materialized.
root: INFO: 2019-09-30T13:06:54.884Z: JOB_MESSAGE_BASIC: Executing operation read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-09-30T13:07:34.009Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 10 based on the rate of progress in the currently running step(s).
root: INFO: 2019-09-30T13:07:58.566Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-09-30T13:07:58.605Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-09-30T13:12:05.861Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T13:12:07.937Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T13:12:10.097Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T13:12:11.537Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T13:12:12.167Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T13:12:13.799Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T13:12:13.826Z: JOB_MESSAGE_BASIC: Finished operation read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-09-30T13:12:13.869Z: JOB_MESSAGE_DEBUG: Executing failure step failure25
root: INFO: 2019-09-30T13:12:13.899Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S06:read/Read+split+pair_with_one+group/Reify+group/Write failed., Internal Issue (f19268492372cddc): 63963027:24514
root: INFO: 2019-09-30T13:12:14.252Z: JOB_MESSAGE_WARNING: S01:write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite failed.
root: INFO: 2019-09-30T13:12:14.288Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
root: INFO: 2019-09-30T13:12:14.400Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-09-30T13:12:14.450Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-09-30T13:12:14.471Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-09-30T13:16:40.691Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-09-30T13:16:40.783Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-09-30T13:16:40.823Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-09-30_06_06_49-4361594281457277123 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1569848805788/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1569848805788/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1569848805788\\/results[^/\\\\]*'
root: INFO: Starting the size estimation of the input
root: INFO: Finished listing 0 files in 0.03775334358215332 seconds.
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-integrationTest-perf.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 605.435s

FAILED (errors=1)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:integrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10m 51s

2019-09-30 13:16:52,728 8797887c MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 841, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 687, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 91, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-09-30 13:16:52,730 8797887c MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-09-30 13:16:52,732 8797887c MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 841, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 687, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 91, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-09-30 13:16:52,732 8797887c MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-09-30 13:16:52,733 8797887c MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-09-30 13:16:52,733 8797887c MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/runs/8797887c/pkb.log>
2019-09-30 13:16:52,733 8797887c MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/runs/8797887c/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_WordCountIT_Py35 #547

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/547/display/redirect>

Changes:


------------------------------------------
[...truncated 157.77 KB...]
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "write/Write/WriteImpl/FinalizeWrite.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s7"
        },
        "serialized_fn": "eNrNV3l/3MQZlnadAxFoSEqaQI8lJVSm7MoOJCQupE03CZiNN65ssHoEdVaa3VEsafRqRt6YZjmSrmPofXO09KDl6E3v+/gc/TJ9Z9Zrd8Gm+Q9+v0TjeWfeQ8/7zDPaJ8t2QDISMOq3KElqMiepaPM8EbWA59SqkzgmrZgu5CTLaH6Kn0ktMMafArMHJdvbZRiG3y5wT94RGQ2g3PSuVza5nFGfRakUMDaaABe0vRZSzEAkz4U1fW4ezQ8qswXbMPr2Zg922DpUlGaF1PEE7GzqjLyQG7brmsUKWC1vBy5kOQ+oEHD9SMqI478WEdTy21FK4ugx6nfzSFILdnll/QYp3MB2Fa3zcKP9RldJL8qIW/48jnNRumjBu7DA3T24yfZ2o3M7iqmfkoT6omi3o4uwZyQCz9A7FbUlEhfUxwKXopDm1pwkMgoeUcbZoQ32YuB39+Bm29uGgbUH7PPwTwPe41lDk35v2B+EURzXfPW0/JiTUNstOKBfScgcbunDrS68t2ht1JkRybAIqup8H1P53t+DD9hsn3cOt3TElONImmRVgW0hHVpFJlRpGlYlHwxUSOFky9VIVoOYF6EzaIUzeeTo8WOHj94zcWzi3kknp6KIsfMVtp+p9Kr0tEh8wUgeCritYXgHFOyKUinWnZMojtKOn9IujtjTg6sagYAjLPDBETy1TawN1jyfkzm61rkG8HZ8oUM9uKPl3aQwUOkGrVGMJhI+5N2O9uohe22pSMYnjoRV3kbTRoHKBrYmlM8oUTWMN73rcJpEyRr8d+qXUtxwsphEKXzY2zeashPz1jDvXXqxeuemmaq6PQFPsC9CIFkGGWreGJpJITk4RWuUlamkORK5Vsgotk7mnSKhqZyNSUAZjzUSE4jEJFPPww1Tj3fTHtxzHo7Y3rgqFBk3UdWnwFnYeE4nWeycWTsk2gJHR1JnmoKW/3CaRcFiTMM5DDStjqgF9/bgmK2rDokkcHwzx/Xtp3CLBVNY2Ed6cJ+tG9aN0pB3/QR5oeiAp/L+rcRJ1a+VQVgaaRL7b/K24IR3o8IwUKKguK9gg496t6JRBZxSUXwdZgpfNVdCN7U0CR/TvV+KaFfVcHLkpAU5JRIJVaSBOtgWfNxmB7yduF9xUMkY1PtwyoXTdsNsGPi/3Nhbv2HVMC4ZxqppXCkZc3Cm2YcHxrXXMC882IdpL0eLw3hCnQs0XYxSMRyrIiZL1OnyfFEgItRR9fuzNNcMSwM6r46mv8DzsM6LVE7P+7PLdx9xRB44IlxUh1Yynjr/A6czaEktW4aHdCn3xSRpheQENGbOmXUDzno3K93IeeLnGFJxf73aGS3EGqI1hYPmCpwblzDrwidGEOtQ6SP0SEpXp2kVUSzxlWBOw4zLahXmV+BhFx4ZcY2SjOfST3hYxChtC94e1c43sQq8PnzShU/p8D76BtL34dMr8BkXzrNGc7MGBhQn8CjD9mGnStipcmNHo16/LPFYm8aFkupXqPt1xTR6OC0ZYtK4hEtlIywZcsy4Yuhuym1qDTeFZWOxZOQNNQ3HjANzcrsRbjPW1+WO4crGpKwn7bKxD4fLGMNAevh2s1HS4IS0TVBK4bOKS97jaJmt8zimmnsV3q4IPKeVQ2GlG0lWSfDGrkhGcCWlFRpTJQuVAf1pWCGiQtAh7cRUordqXq1yJsqFrMguH+4XFZoGikI0Vz4Y8eAhcfAu/awBillrcMbjSEgItCwqVkjOYwGht13NRRwFFKi+ibC/0Nbn8HSSyeV1AYCOXo5pCkyLqb4MT+c5zyFit0i44JV0bFjUUAxpFuvsSpchuQope0hrq785TfnM62Z9t1Hab24395p7zJ1m2SyXIBtHooILOTvPeBO/NURTgnSh6MOSC132aA8ubqFIy+yE6MNjLnxuBS71oIeOj7vwRNFi97MTxVV4cl1eJ69JXp9iKJuXbXacKS280oPP2yoS+/8q1Wcn19g70BmzMVcfu6T4s4LycnWcoaKsvl2K8vSoojwz8x+TnVWof8GFLyLqzyjUv4TgfdmFr6yD99V18A5fE3hfU+B9fQjeN3rwzWsF71tvAG8Tkf42ovgdjeKzbxeKz42i+DzqMjvLZhhK7QuI5Xdd+B5i+XyTvTN17EWlY+wdo13fl/ADmwUsZJS1WYcxpnXmh2yRxQy15EfsubfSkpc215IfK1b/xIWXsRMvKVa/gqx+1YXX+vBTF36mtOTnW2jJL7SW/NKFX63Ar3vwOjr+xoXfrh+H39G3/G22oANiLAt+j/T/Qw/+aGs9xk/iTgcpmMKftgqwtsU6Nbhg5tem8GcM9BfNukG5GOOvW8UY7LAe0J9fg2Lw59vfMMLfBz8CI+EPL7B/rOq7WyEqJEkyH793W/itn8M/G6aGB1tdJEVMFDvUjU/hXw2zaEn4d+2/RbfTnQ==",
        "user_name": "write/Write/WriteImpl/FinalizeWrite/FinalizeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-09-30T06:53:39.438172Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-09-29_23_53_38-3233091081456319737'
 location: 'us-central1'
 name: 'beamapp-jenkins-0930065332-262802'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-09-30T06:53:39.438172Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-09-29_23_53_38-3233091081456319737]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-29_23_53_38-3233091081456319737?project=apache-beam-testing
root: INFO: Job 2019-09-29_23_53_38-3233091081456319737 is in state JOB_STATE_RUNNING
root: INFO: 2019-09-30T06:53:41.841Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-09-30T06:53:42.260Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-09-30T06:53:42.828Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-09-30T06:53:42.856Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-09-30T06:53:42.883Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
root: INFO: 2019-09-30T06:53:42.925Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-09-30T06:53:42.961Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-09-30T06:53:43.063Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-09-30T06:53:43.105Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-09-30T06:53:43.136Z: JOB_MESSAGE_DETAILED: Fusing consumer split into read/Read
root: INFO: 2019-09-30T06:53:43.164Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
root: INFO: 2019-09-30T06:53:43.195Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Reify into pair_with_one
root: INFO: 2019-09-30T06:53:43.224Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Write into group/Reify
root: INFO: 2019-09-30T06:53:43.260Z: JOB_MESSAGE_DETAILED: Fusing consumer group/GroupByWindow into group/Read
root: INFO: 2019-09-30T06:53:43.293Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/GroupByWindow
root: INFO: 2019-09-30T06:53:43.332Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
root: INFO: 2019-09-30T06:53:43.363Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WriteBundles/WriteBundles into format
root: INFO: 2019-09-30T06:53:43.401Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles/WriteBundles
root: INFO: 2019-09-30T06:53:43.436Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WindowInto(WindowIntoFn) into write/Write/WriteImpl/Pair
root: INFO: 2019-09-30T06:53:43.469Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Reify into write/Write/WriteImpl/WindowInto(WindowIntoFn)
root: INFO: 2019-09-30T06:53:43.500Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Write into write/Write/WriteImpl/GroupByKey/Reify
root: INFO: 2019-09-30T06:53:43.529Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/GroupByWindow into write/Write/WriteImpl/GroupByKey/Read
root: INFO: 2019-09-30T06:53:43.554Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Extract into write/Write/WriteImpl/GroupByKey/GroupByWindow
root: INFO: 2019-09-30T06:53:43.590Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/InitializeWrite into write/Write/WriteImpl/DoOnce/Read
root: INFO: 2019-09-30T06:53:43.623Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-09-30T06:53:43.651Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-09-30T06:53:43.687Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-09-30T06:53:43.713Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-09-30T06:53:43.845Z: JOB_MESSAGE_DEBUG: Executing wait step start26
root: INFO: 2019-09-30T06:53:43.912Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
root: INFO: 2019-09-30T06:53:43.938Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Create
root: INFO: 2019-09-30T06:53:43.953Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-09-30T06:53:43.964Z: JOB_MESSAGE_BASIC: Executing operation group/Create
root: INFO: 2019-09-30T06:53:43.974Z: JOB_MESSAGE_BASIC: Starting 10 workers in us-central1-a...
root: INFO: 2019-09-30T06:53:44.025Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/GroupByKey/Create
root: INFO: 2019-09-30T06:53:44.025Z: JOB_MESSAGE_BASIC: Finished operation group/Create
root: INFO: 2019-09-30T06:53:44.094Z: JOB_MESSAGE_DEBUG: Value "group/Session" materialized.
root: INFO: 2019-09-30T06:53:44.120Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/GroupByKey/Session" materialized.
root: INFO: 2019-09-30T06:53:44.154Z: JOB_MESSAGE_BASIC: Executing operation read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-09-30T06:54:12.355Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 8 based on the rate of progress in the currently running step(s).
root: INFO: 2019-09-30T06:54:12.381Z: JOB_MESSAGE_DETAILED: Resized worker pool to 8, though goal was 10.  This could be a quota issue.
root: INFO: 2019-09-30T06:54:17.715Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 10 based on the rate of progress in the currently running step(s).
root: INFO: 2019-09-30T06:54:41.622Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-09-30T06:54:41.653Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-09-30T06:58:44.502Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given
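
(This TypeError is consistent with a dill version mismatch between the environment that pickled the main session at submit time and the dill installed on the Dataflow workers: _create_function gained an extra positional argument across dill releases, so a session written by one version cannot be read by the other. A hedged sketch of the round trip implied by the traceback; the path and helper names are illustrative, not taken from the SDK:

import dill

def stage_main_session(path='/tmp/main_session.pickle'):
    # Submit-time side, roughly what apache_beam.internal.pickler.dump_session does:
    # pickle the contents of __main__ so workers can reconstruct it.
    dill.dump_session(path)

def load_main_session(path='/tmp/main_session.pickle'):
    # Worker side (pickler.load_session -> dill.load_session in the traceback).
    # With mismatched dill versions this call raises the TypeError logged above.
    dill.load_session(path)
)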

root: INFO: 2019-09-30T06:58:46.572Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T06:58:48.658Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T06:58:50.718Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T06:58:52.778Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T06:58:53.648Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T06:58:54.844Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T06:58:55.200Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
root: INFO: 2019-09-30T06:58:55.300Z: JOB_MESSAGE_DEBUG: Executing failure step failure25
root: INFO: 2019-09-30T06:58:55.338Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S01:write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite failed., Internal Issue (e88de25a8b44a85a): 63963027:24514
root: INFO: 2019-09-30T06:58:55.408Z: JOB_MESSAGE_BASIC: Finished operation read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-09-30T06:58:55.518Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-09-30T06:58:55.577Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-09-30T06:58:55.597Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-09-30T07:03:41.698Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-09-30T07:03:41.749Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-09-30T07:03:41.786Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-09-29_23_53_38-3233091081456319737 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1569826408071/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1569826408071/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1569826408071\\/results[^/\\\\]*'
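(The translate_pattern DEBUG line above shows Beam's filesystem matcher converting the GCS glob into a regular expression before listing result shards; note that '*' becomes '[^/\\]*', so it cannot cross a path separator. As a rough standard-library analogue only, since Beam uses its own implementation, fnmatch.translate does a similar conversion, except that its '*' becomes '.*' and would also match '/':

import re
import fnmatch

pattern = 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1569826408071/results*'
regex = fnmatch.translate(pattern)
# A shard name produced by the write step would match the translated pattern.
print(re.match(regex, pattern[:-1] + '-00000-of-00001') is not None)  # True
)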
root: INFO: Starting the size estimation of the input
root: INFO: Finished listing 0 files in 0.10852527618408203 seconds.
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-integrationTest-perf.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 627.728s

FAILED (errors=1)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:integrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 13m 17s

2019-09-30 07:04:00,063 862f1fd2 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 841, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 687, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 91, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
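(The harness failure above is just the Gradle exit status surfacing through PerfKit: the traceback shows gcp_dpb_dataflow.py treating any non-zero return code from the submitted command as a failed integration test. A hedged sketch of that pattern; the runner command and helper name are illustrative, only the assert is taken from the traceback:

import subprocess

def submit_job(cmd):
    # PerfKit shells out to run the Beam test suite and only checks the exit code.
    retcode = subprocess.run(cmd, shell=True).returncode
    # Mirrors perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py, line 91:
    assert retcode == 0, "Integration Test Failed."

# Illustrative invocation of the failing task named in the Gradle output above:
# submit_job('./gradlew :sdks:python:test-suites:dataflow:py35:integrationTest')
)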
2019-09-30 07:04:00,064 862f1fd2 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-09-30 07:04:00,068 862f1fd2 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 841, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 687, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 91, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-09-30 07:04:00,069 862f1fd2 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-09-30 07:04:00,076 862f1fd2 MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-09-30 07:04:00,076 862f1fd2 MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/runs/862f1fd2/pkb.log>
2019-09-30 07:04:00,077 862f1fd2 MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/runs/862f1fd2/completion_statuses.json>
Build step 'Execute shell' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_WordCountIT_Py35 #546

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/546/display/redirect>

Changes:


------------------------------------------
[...truncated 157.12 KB...]
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "write/Write/WriteImpl/FinalizeWrite.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s7"
        },
        "serialized_fn": "eNrNV/l/3MQVl3btJIijISkhIT22KW5lyq7igFNiKJRuEuJss3Flg9VCqs5KszuKJY2eZuSNacSR1I7pfRd6l9536X0ff0f/mb6ZXdss2DS/wedjazTXm/e+7/u+o32magckIwGjfoeSpCFzkoouzxPRCHhOrSaJY9KJ6WJOsozmJ/np1AJj8lkwS6jY3k2GYfjdAtfkPZHRAKptbzeOZTkPqBAwNmI84vjXIYJafjdKSRw9Sf1+HklqwbhX1aZS2MXGvRvVu1zJqM+iVArYPeojTujxRkjRSSJ5LqzZ8ws4fEYNW7AHHbyhXYI18JAXMiukNijgxrY2H6VbQze1izW4uVN0LsAt9is9lvSSjLjlL2A7H6VLFrwJre8t4Vbb24uGulFM/ZQk1BdFtxtdgn0jFniGu1PRWCZxQX3EZTkKaW7NSyKj4DE1OLcxBvvR8JtLuM32LDQ82KE8hANBGMVxw1dPy485CfW4Bbdr2ITM4eAqHHLhDm98Yycc9vDVgLcUHe8GfEmiZGjtrdq8isvJYhKl8DY9kBaJLxjJQwFvbxneAWVYdQfB9WLe8RUxiISanqzfVefd+oS9tW/y6HQI79hCJSOSYchUoXKEqejeWcKdNjvA7mCHvfO4rCdmHEfSJKsLzCLp0ToeUadpWJd80FAhhZOt1CNZD2JehM4gmc7U9PET9x29d/r4selj005ORREjUSYw2FtHHR/6/C7vTuXzhD2cKhLl7bYRvFtHEPAEXRcCszeAzfbGcJgUksOkJrnPKMG8wV1tjXrAVec9I9nXY2LYWAt8XuZR2mtyne67EZB6CY2Od0hZUwWWYmJzEsW4yE9pH1ukp7NedEZZmUqaY/00ChnF1sN5r0hoKudiElDGY237KNqeYup5rGXq9h5awr0XYNr2JhVAyLijdV18zuLWczbJYuf0sDb1CBwfOTrT3LL8R9MsCpZiGs6joVlVSxa8t4T7bA1/P0pD3vcTDElFglV9YieVUY7oWhSW5hiJ/VfttmBGYx8SSeD+7fzZ9OIkLrHgAYz3fSU8aOs8LUe0r3x4aKSKgpwSifQo0kBVqAXvt9nt3h5cr9KjxAQeXoUPuNC0W2bLwP9qa3/z5nXDuGwY66ZxtWLMw8n2Kpya1LsQq1wpJZxehUe8HEccxhPqXKTpUpSKjbYuYrJMnT7PlwTGQR0Vhj9Hc83TNKALivH+Is/DJi9SObvgz63cM+2IPHBEuKRqQTKeOi8DwRmA0MhW4Ix25YGYJJ2QPAiz586bTQPOerepksx54udoUgnBprctrZAaoqFUwQfX4NykhLYL50cQ61HpEymRXXP6mE4RxRJDgg9pmHFazYK7BvMuLIxsjZKM59JPeFjEKFuPevtUKb0qj/DYKiy64GnzPu4NpO/Dh9fgIy48zmbb2yUwoNiBJximDzNVwUxVW7tbzeYViRpnGhcrKl+hztdV0yixWzHElHEZp6pGWDHkmHHV0NmU42oOF4VVY6li5C3VDceMQ/NylxGOG5vzcvfGzFanqjvdqnEAmytow0B6XLDbrYoGJ6RdggoFH1Vc8p7Ckbkmj2OquVfj3ZrAgqtNhLV+JFktwau3JhnBmZTWaExVfddIoO5UGtaIqBHckPZiKnG3Sl6jdjrKhazJPt9YL2o0DRSFaK72oMUjE+LI3frZAF/CxwaKFkdCAtF3hGKF5DwW0PF2qb6Io4BCoG8ZzC+E3i34dirJ5MpmyQHV0zFNoasvEn2rncpznkOPHZTAvIq2DZGGYoNmF/XpSkNh6RrE7Iy+VfztaZqce8ls7jUqB81d5n5zn7nHrJrVCqSTSFTuQsYeZ0kbb3xoS8hdEKsgXSjYEyUs76BIfTaDqy65sLIGT5bwcdx42YWSnWAzOsgB2uoOU2ILT3mHFeWRqzNKsnytWTMb/s0sT8HTRae4Bs9syuvUdcnrswxl84qtjmX3MyVcV0v4hM0eGhJ6ID1ma745dllRahUVZ22Sochce71EZn1UZJ4791+TnVWJ+KQLn8JEPKcS8WnE8zMufFYH9v/B+5wG7/Ob4B27LvC+oMD74svB+1IJX34leNvo9lcQxa9qFL/2eqH4/CiKL6BUs7OsxVB9v45YfsOFbyKWL7TZG1PavqWkjb1h5OzbEr5jM8I6LGAho6zLtPR8l0XsIkN5+R57/rXk5cXt5eX7itU/cOGHmIkXFat/hKz+sQs/WYWfuvAzJS8/30FefqHl5Zcu/GoNfl3Cb3DjSy789jrL4Xe6HH5PX/O32aI+FM+z4A9I/T+W8Cdbyzh+ZPZ6SNMU/ryTgeES6+TgXloYduEvaOivg99fkfA3bq2/resoMWNFUsREJVnd5RT+3jI1jQfx44H/2OnAwQrrEf2JN/Acf6j9E4/7l/4YUPkQkiSZjx/eHfzuzeHfLbPoSPhP43+EYtOa",
        "user_name": "write/Write/WriteImpl/FinalizeWrite/FinalizeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-09-30T00:49:27.330371Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-09-29_17_49_25-16544735406404762763'
 location: 'us-central1'
 name: 'beamapp-jenkins-0930004923-837383'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-09-30T00:49:27.330371Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-09-29_17_49_25-16544735406404762763]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-29_17_49_25-16544735406404762763?project=apache-beam-testing
root: INFO: Job 2019-09-29_17_49_25-16544735406404762763 is in state JOB_STATE_RUNNING
root: INFO: 2019-09-30T00:49:29.862Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-09-30T00:49:30.320Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-09-30T00:49:31.454Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-09-30T00:49:31.475Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-09-30T00:49:31.498Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
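(The two "Combiner lifting skipped" messages mean the service only pushes a combiner ahead of the shuffle when the GroupByKey is immediately followed by one; neither GroupByKey here is, so every element travels through the shuffle. A small, purely illustrative contrast in Beam Python, not the pipeline under test:

import apache_beam as beam

with beam.Pipeline() as p:
    pairs = p | beam.Create([('a', 1), ('a', 2), ('b', 3)])
    # CombinePerKey can be lifted: partial sums are computed before the shuffle.
    lifted = pairs | 'lifted_sum' >> beam.CombinePerKey(sum)
    # A bare GroupByKey followed by a ParDo (like the 'group' and 'count' steps in
    # this job) cannot be lifted, so all values are shuffled unaggregated.
    unlifted = (pairs
                | 'group' >> beam.GroupByKey()
                | 'count' >> beam.MapTuple(lambda k, vs: (k, sum(vs))))
)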
root: INFO: 2019-09-30T00:49:31.529Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-09-30T00:49:31.556Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-09-30T00:49:31.623Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-09-30T00:49:31.663Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-09-30T00:49:31.684Z: JOB_MESSAGE_DETAILED: Fusing consumer split into read/Read
root: INFO: 2019-09-30T00:49:31.716Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
root: INFO: 2019-09-30T00:49:31.740Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Reify into pair_with_one
root: INFO: 2019-09-30T00:49:31.763Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Write into group/Reify
root: INFO: 2019-09-30T00:49:31.786Z: JOB_MESSAGE_DETAILED: Fusing consumer group/GroupByWindow into group/Read
root: INFO: 2019-09-30T00:49:31.810Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/GroupByWindow
root: INFO: 2019-09-30T00:49:31.834Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
root: INFO: 2019-09-30T00:49:31.859Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WriteBundles/WriteBundles into format
root: INFO: 2019-09-30T00:49:31.885Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles/WriteBundles
root: INFO: 2019-09-30T00:49:31.920Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WindowInto(WindowIntoFn) into write/Write/WriteImpl/Pair
root: INFO: 2019-09-30T00:49:31.956Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Reify into write/Write/WriteImpl/WindowInto(WindowIntoFn)
root: INFO: 2019-09-30T00:49:31.983Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Write into write/Write/WriteImpl/GroupByKey/Reify
root: INFO: 2019-09-30T00:49:32.010Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/GroupByWindow into write/Write/WriteImpl/GroupByKey/Read
root: INFO: 2019-09-30T00:49:32.037Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Extract into write/Write/WriteImpl/GroupByKey/GroupByWindow
root: INFO: 2019-09-30T00:49:32.076Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/InitializeWrite into write/Write/WriteImpl/DoOnce/Read
root: INFO: 2019-09-30T00:49:32.115Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-09-30T00:49:32.151Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-09-30T00:49:32.186Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-09-30T00:49:32.211Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-09-30T00:49:32.360Z: JOB_MESSAGE_DEBUG: Executing wait step start26
root: INFO: 2019-09-30T00:49:32.411Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
root: INFO: 2019-09-30T00:49:32.448Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Create
root: INFO: 2019-09-30T00:49:32.453Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-09-30T00:49:32.467Z: JOB_MESSAGE_BASIC: Executing operation group/Create
root: INFO: 2019-09-30T00:49:32.495Z: JOB_MESSAGE_BASIC: Starting 10 workers in us-central1-a...
root: INFO: 2019-09-30T00:49:32.556Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/GroupByKey/Create
root: INFO: 2019-09-30T00:49:32.556Z: JOB_MESSAGE_BASIC: Finished operation group/Create
root: INFO: 2019-09-30T00:49:32.641Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/GroupByKey/Session" materialized.
root: INFO: 2019-09-30T00:49:32.678Z: JOB_MESSAGE_DEBUG: Value "group/Session" materialized.
root: INFO: 2019-09-30T00:49:32.745Z: JOB_MESSAGE_BASIC: Executing operation read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-09-30T00:49:58.943Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 8 based on the rate of progress in the currently running step(s).
root: INFO: 2019-09-30T00:49:58.968Z: JOB_MESSAGE_DETAILED: Resized worker pool to 8, though goal was 10.  This could be a quota issue.
root: INFO: 2019-09-30T00:50:04.331Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 10 based on the rate of progress in the currently running step(s).
root: INFO: 2019-09-30T00:50:36.179Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-09-30T00:50:36.214Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-09-30T00:54:43.431Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T00:54:45.513Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T00:54:46.014Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T00:54:47.586Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T00:54:48.085Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T00:54:49.658Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T00:54:49.759Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T00:54:49.783Z: JOB_MESSAGE_BASIC: Finished operation read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-09-30T00:54:49.852Z: JOB_MESSAGE_DEBUG: Executing failure step failure25
root: INFO: 2019-09-30T00:54:49.889Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S06:read/Read+split+pair_with_one+group/Reify+group/Write failed., Internal Issue (1a19a8f93734900e): 63963027:24514
root: INFO: 2019-09-30T00:54:50.231Z: JOB_MESSAGE_WARNING: S01:write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite failed.
root: INFO: 2019-09-30T00:54:50.266Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
root: INFO: 2019-09-30T00:54:50.376Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-09-30T00:54:50.449Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-09-30T00:54:50.483Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-09-30T00:58:45.466Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-09-30T00:58:45.509Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-09-30T00:58:45.537Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-09-29_17_49_25-16544735406404762763 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1569804562525/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1569804562525/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1569804562525\\/results[^/\\\\]*'
root: INFO: Starting the size estimation of the input
root: INFO: Finished listing 0 files in 0.05009269714355469 seconds.
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-integrationTest-perf.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 575.656s

FAILED (errors=1)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:integrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10m 21s

2019-09-30 00:58:59,638 809b16c2 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 841, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 687, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 91, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-09-30 00:58:59,640 809b16c2 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-09-30 00:58:59,642 809b16c2 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 841, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 687, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 91, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-09-30 00:58:59,642 809b16c2 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-09-30 00:58:59,642 809b16c2 MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-09-30 00:58:59,643 809b16c2 MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/runs/809b16c2/pkb.log>
2019-09-30 00:58:59,643 809b16c2 MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py35/ws/runs/809b16c2/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org