Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/09/30 19:03:48 UTC

Build failed in Jenkins: beam_PerformanceTests_WordCountIT_Py37 #527

See <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py37/527/display/redirect?page=changes>

Changes:

[ihr] Add clarification about authorized views

[kirillkozlov] [BEAM-8275] Beam SQL should support BigQuery in DIRECT_READ mode

[github] Addressed review comments

[github] Added a test for BigQuery SQL read in EXPORT mode

[lukecwik] [BEAM-6923] limit gcs buffer size to 1MB for artifact upload (#9647)


------------------------------------------
[...truncated 156.44 KB...]
        ],
        "non_parallel_inputs": {
          "side0-write/Write/WriteImpl/FinalizeWrite": {
            "@type": "OutputReference",
            "output_name": "out",
            "step_name": "SideInput-s18"
          },
          "side1-write/Write/WriteImpl/FinalizeWrite": {
            "@type": "OutputReference",
            "output_name": "out",
            "step_name": "SideInput-s19"
          },
          "side2-write/Write/WriteImpl/FinalizeWrite": {
            "@type": "OutputReference",
            "output_name": "out",
            "step_name": "SideInput-s20"
          }
        },
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "write/Write/WriteImpl/FinalizeWrite.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s7"
        },
        "serialized_fn": "eNrNV/l/G8UV10rOwZKmAUogpIdISbumSEoIOexSSKokYBQr7trg7RGW0e5Is3h3Z9/MrBVTiwKpHBd639ArKb1vet/H39F/pm9GklOBTfNb+vnYu6s3897M+77v+87ucyUnIBkJGPVblCRVJUgq21wkshpwQe06iWPSiumiIFlGxWl+NrWhMPk8WD0oOl6pUCj47RRKY0Eijn8tIqntt6OUxNEz1O+KSFEbJrxdxiXHsKIjMxrAtqa3A22Z4AGVErazCe9mPUetZNRnUaok7BjfIw4YezWkuEmiuJD2zPkFND+qzTbsxA3e1OyB7ZhQUZrlysSTcHPT7IDn6pptVzNfg7e08tYF2O28PhNFL6qI2/4C3uejdMmGt2L0PT24xfH2YKR2FFM/I4r5maDt6CLcOhaBZ+idyuoyiXOcJ/hyFFJhzyuiouAJbZwb2eA2DPy2HtzueDYGHnjoHcLeIIziuOrrq+3HnITGbsMdpgJSCbizD/tcuMvbNvKE/d55fO7I6VpN0SSrSMSJdGgFq1uhaVhRfHCjUslatlKJVCWIeR7WBtjUDh89NnXi2NTxqUMnHjhSE1TmMZbi7XnrWtopSagv87ZO+x1Mb/+dPXiXw/ayu9h+D2cVoIzzdTZpnviSERFKuLtRMNsMOGYNB8bgMjY5vNkLfF6JKO3UucHn3bjAPT042PJu0VnraIMtaMISBe/x7kF75aAzHMqTyUNHwwpvo+na+toG7/X2jofoxLw1iuOYwcq9m3pOmvQDnmC5pcTaDkp0rzeBZpIrDu/zbsLHJEqG1bvPAKB5VMtiEqVQ8fZpguuOSrGSgkQxJumntIt35GN13XSEzyjRCNWaeWuclamiAvuqmqsotk+JTp7QVM3FJKCMxwaqQwjVYaav9zcscz9Ce/DABTjqeJM6c2TcoYppytritetMksW1s8OeNRY4NrZ0Zrhl+4+nWRQsxTScx0Azur9sON6DE46BISSKwNRmjhvTT+MUG6ZxY+/vwYOOt1ujF2gF0M2k84MPePvRqL2n9XZ908bTuCehFWl6+TA8ZIjQjdKQd/0E8dQwohw9vJWm6TgmjLRNxUnsv8HbhpMG/uWIdnWwU2PNFwhKFDIuTwPd2DZ80GF3eDtxviap1iCo9+G0C2echtUo4H+pcVvdXi+sFtatS8X5Apxt9uGRSeMyygUe7cOMJ9BSYzyhtadpuhSlcnSvyJgs01qXiyWJedGaTsufo8LwNQ3ogu5hf5GLsM7zVM0s+HMrR47XpAhqMlzS3a0YT2v/BUptUI9qtgKPma08GJOkFZKHoDF73qoX4Jx3u+5ywRNfYEhN5o3dzhoJNfgM5Q2aa3B+UsGcCx8ag6tDlY/lREa6ZplWHsUKU4J5gzEO61FYWIPHXXhizDVKMi6Un/Awj1HqFr1bNUXeQCnw+vBhFz5iwvvoGyjfh4+uwcdcuMAazc2qF1D8AU8yrB2WqYhlKjV2NOr1p1RBWVeKq4XQulq8ZPWKq0V5YNW6UgqLauJS4SqWT21bxZGwtFQUx1aL4cS+Atq2h9sGdrVjYBs+lfRTu7QXry9YIc4E32k2iib1kLYJKio8pWniPYuWuTqPY2poVebtssQWLB8My91IsXKCh3FZMYIjKS3TmOqOLw8ahoZlIssEHdJOTBV669JUy2cjIVVZdflovizTNNAEoUL7YMQDB+WB+8y1Cih8rUH7xpFUEBgV0zVXnMcSQm+7/i3jKKBAzbmD1YO26dwzSaZWNnobOmY4pikwo33mnDsjBBcQsTsVPO0VTWxYMlCMSBSb1bXqQXIZUvaY0WF/cxLy2des+p5CcY81Ye22dlk7raJVLEI2iTQEFwS7wHgT3wFkU4FyIe/Dsgtd9mQPLm6hGivspOzDMy58fA1We9BDx2dd+ETeYg+zk/lleG5DOQ9fl3I+z1ARX3DYFNMyd6kHn3TY/9a0vl6NnRpycyAhVmO+Pr
Gq+bOG4nF5kqFerN8ovfjUuF68OPtvi53TqL/kwqcR9Rc16p9B8D7rwuc2wPv8Bnj3Xxd4X9DgfXEE3pd68OXrAe8rm4D3ev39KkL4NQPh128UhC+PQ/gKSi47x2YZqug3EMhvuvAtBPKVJrsREvVtLVHs/0aWvqPgisMCFjLK2qzDGDMScpUtsZihTHyXvfxmMvHq5jLxPU3Y77vwA8T5VU3YHyJhf+TCj/vwExd+qmXiZ1vIxM+NTPzChV+uwa968Bo6/tqF32ww/bf0Tb+oFk1AjGXD75DZv+/BHxxDh8FKuMAft/IfzLAfMS8ugzj4sfMnjPJnI9b4rtzpIEVT+MtWIYZT7NOD02dh+BP+ikH+ZhLG4uVJHhNdb334Uvh7wzJnrwZWKpJkPr79tvBNVcA/cMh8rEXSHx1o/1zPWwr+Vf0PFMnUTA==",
        "user_name": "write/Write/WriteImpl/FinalizeWrite/FinalizeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-09-30T18:56:34.742152Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-09-30_11_56_33-452774739975975206'
 location: 'us-central1'
 name: 'beamapp-jenkins-0930185631-755388'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-09-30T18:56:34.742152Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-09-30_11_56_33-452774739975975206]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_56_33-452774739975975206?project=apache-beam-testing
root: INFO: Job 2019-09-30_11_56_33-452774739975975206 is in state JOB_STATE_RUNNING
root: INFO: 2019-09-30T18:56:36.829Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-09-30T18:56:37.284Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
root: INFO: 2019-09-30T18:56:37.852Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-09-30T18:56:37.886Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-09-30T18:56:37.921Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
root: INFO: 2019-09-30T18:56:37.962Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-09-30T18:56:37.984Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-09-30T18:56:38.073Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-09-30T18:56:38.125Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-09-30T18:56:38.159Z: JOB_MESSAGE_DETAILED: Fusing consumer split into read/Read
root: INFO: 2019-09-30T18:56:38.191Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
root: INFO: 2019-09-30T18:56:38.222Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Reify into pair_with_one
root: INFO: 2019-09-30T18:56:38.250Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Write into group/Reify
root: INFO: 2019-09-30T18:56:38.287Z: JOB_MESSAGE_DETAILED: Fusing consumer group/GroupByWindow into group/Read
root: INFO: 2019-09-30T18:56:38.315Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/GroupByWindow
root: INFO: 2019-09-30T18:56:38.344Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
root: INFO: 2019-09-30T18:56:38.378Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WriteBundles/WriteBundles into format
root: INFO: 2019-09-30T18:56:38.408Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles/WriteBundles
root: INFO: 2019-09-30T18:56:38.443Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WindowInto(WindowIntoFn) into write/Write/WriteImpl/Pair
root: INFO: 2019-09-30T18:56:38.476Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Reify into write/Write/WriteImpl/WindowInto(WindowIntoFn)
root: INFO: 2019-09-30T18:56:38.511Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Write into write/Write/WriteImpl/GroupByKey/Reify
root: INFO: 2019-09-30T18:56:38.541Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/GroupByWindow into write/Write/WriteImpl/GroupByKey/Read
root: INFO: 2019-09-30T18:56:38.576Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Extract into write/Write/WriteImpl/GroupByKey/GroupByWindow
root: INFO: 2019-09-30T18:56:38.613Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/InitializeWrite into write/Write/WriteImpl/DoOnce/Read
root: INFO: 2019-09-30T18:56:38.651Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-09-30T18:56:38.682Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-09-30T18:56:38.713Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-09-30T18:56:38.747Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-09-30T18:56:38.883Z: JOB_MESSAGE_DEBUG: Executing wait step start26
root: INFO: 2019-09-30T18:56:38.952Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
root: INFO: 2019-09-30T18:56:39.005Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Create
root: INFO: 2019-09-30T18:56:39.019Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-09-30T18:56:39.027Z: JOB_MESSAGE_BASIC: Executing operation group/Create
root: INFO: 2019-09-30T18:56:39.052Z: JOB_MESSAGE_BASIC: Starting 10 workers in us-central1-f...
root: INFO: 2019-09-30T18:56:39.108Z: JOB_MESSAGE_BASIC: Finished operation group/Create
root: INFO: 2019-09-30T18:56:39.108Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/GroupByKey/Create
root: INFO: 2019-09-30T18:56:39.173Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/GroupByKey/Session" materialized.
root: INFO: 2019-09-30T18:56:39.196Z: JOB_MESSAGE_DEBUG: Value "group/Session" materialized.
root: INFO: 2019-09-30T18:56:39.277Z: JOB_MESSAGE_BASIC: Executing operation read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-09-30T18:56:59.952Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-09-30T18:56:59.985Z: JOB_MESSAGE_DETAILED: Resized worker pool to 1, though goal was 10.  This could be a quota issue.
root: INFO: 2019-09-30T18:57:05.594Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 10 based on the rate of progress in the currently running step(s).
root: INFO: 2019-09-30T18:57:33.890Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-09-30T18:57:33.925Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-09-30T19:01:34.865Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.7/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T19:01:36.394Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.7/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T19:01:36.482Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.7/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T19:01:37.457Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.7/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T19:01:37.513Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.7/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: 2019-09-30T19:01:37.539Z: JOB_MESSAGE_BASIC: Finished operation read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-09-30T19:01:37.608Z: JOB_MESSAGE_DEBUG: Executing failure step failure25
root: INFO: 2019-09-30T19:01:37.668Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S06:read/Read+split+pair_with_one+group/Reify+group/Write failed., Internal Issue (e22c9f22a8f03221): 63963027:24514
root: INFO: 2019-09-30T19:01:38.162Z: JOB_MESSAGE_WARNING: S01:write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite failed.
root: INFO: 2019-09-30T19:01:38.195Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
root: INFO: 2019-09-30T19:01:38.300Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-09-30T19:01:38.355Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-09-30T19:01:38.393Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-09-30T19:03:37.813Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-09-30T19:03:37.863Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-09-30T19:03:37.903Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-09-30_11_56_33-452774739975975206 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1569869790843/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1569869790843/results*' -> 'gs://temp\\-storage\\-for\\-end\\-to\\-end\\-tests/py\\-it\\-cloud/output/1569869790843/results[^/\\\\]*'
root: INFO: Starting the size estimation of the input
root: INFO: Finished listing 0 files in 0.06061697006225586 seconds.
--------------------- >> end captured logging << ---------------------
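The `translate_pattern` debug line in the captured log shows the filesystem layer compiling the `results*` glob into a regex in which `*` matches anything except a path separator, so the wildcard cannot cross directory boundaries. A rough self-contained sketch of that idea (a hypothetical helper, not `apache_beam.io.filesystem`'s real `translate_pattern`):

```python
import re

def glob_to_regex(pattern):
    """Translate a simple glob: '*' becomes a regex that matches any
    run of characters except '/' or '\\', so it stays within one
    directory level. Illustrative only, not Beam's implementation."""
    parts = (re.escape(p) for p in pattern.split("*"))
    return r"[^/\\]*".join(parts)

regex = re.compile(glob_to_regex("gs://bucket/output/results*"))

print(bool(regex.fullmatch("gs://bucket/output/results-00000-of-00003")))
print(bool(regex.fullmatch("gs://bucket/output/results/shard-0")))
```

The first match succeeds (the shard suffix contains no separator) while the second fails, mirroring why the listing above only picks up output shards directly under the `results` prefix.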

----------------------------------------------------------------------
XML: nosetests-integrationTest-perf.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 434.606s

FAILED (errors=1)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:integrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 8m 2s

2019-09-30 19:03:47,031 b87e9b7d MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py37/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 841, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py37/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 687, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py37/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py37/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 91, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
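The `AssertionError` above comes from PerfKitBenchmarker's `SubmitJob` asserting that the benchmark command exited with code 0. A minimal sketch of that retcode check (my own stand-in, not the real `gcp_dpb_dataflow.SubmitJob`):

```python
import subprocess
import sys

def submit_job(cmd):
    """Run the benchmark command and fail loudly on a non-zero exit
    code, like PerfKit's SubmitJob does. Sketch only."""
    retcode = subprocess.run(cmd).returncode
    assert retcode == 0, "Integration Test Failed."

# A passing command returns quietly.
submit_job([sys.executable, "-c", "raise SystemExit(0)"])

# A failing command (like the Gradle task here exiting 1) raises.
try:
    submit_job([sys.executable, "-c", "raise SystemExit(1)"])
except AssertionError as exc:
    print(exc)
```

Because the assertion message carries no detail, the actual failure has to be traced back through the Gradle and Dataflow logs above it, as in this email.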
2019-09-30 19:03:47,032 b87e9b7d MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-09-30 19:03:47,034 b87e9b7d MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py37/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py37/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 841, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py37/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 687, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py37/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py37/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 91, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-09-30 19:03:47,034 b87e9b7d MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-09-30 19:03:47,035 b87e9b7d MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-09-30 19:03:47,035 b87e9b7d MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py37/ws/runs/b87e9b7d/pkb.log>
2019-09-30 19:03:47,035 b87e9b7d MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py37/ws/runs/b87e9b7d/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PerformanceTests_WordCountIT_Py37 #528

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py37/528/display/redirect?page=changes>
