Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/05/02 00:30:01 UTC

Build failed in Jenkins: beam_PerformanceTests_Python #2358

See <https://builds.apache.org/job/beam_PerformanceTests_Python/2358/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-7176] don't reuse Spark context in tests (causes OOM)

[ehudm] Add pip check invocations to all tox environments.

[iemejia] [BEAM-7196] Add Display Data to FileIO Match/MatchAll

[migryz] Downgrade logging level to avoid log spam

[kcweaver] [BEAM-7201] Go README: update task name

[lcwik] [BEAM-7179] Correct file extension (#8434)

------------------------------------------
[...truncated 227.19 KB...]
        }, 
        "user_name": "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0)", 
        "windowing_strategy": "%0AD%22B%0A%1Dref_Coder_GlobalWindowCoder_1%12%21%0A%1F%0A%1D%0A%1Bbeam%3Acoder%3Aglobal_window%3Av1jT%0A%25%0A%23%0A%21beam%3Awindowfn%3Aglobal_windows%3Av0.1%10%01%1A%1Dref_Coder_GlobalWindowCoder_1%22%02%3A%00%28%010%018%01H%01"
      }
    }, 
    {
      "kind": "CollectionToSingleton", 
      "name": "SideInput-s20", 
      "properties": {
        "output_info": [
          {
            "encoding": {
              "@type": "kind:stream", 
              "component_encodings": [
                {
                  "@type": "kind:windowed_value", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": []
                        }, 
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": []
                        }
                      ], 
                      "is_pair_like": true
                    }, 
                    {
                      "@type": "kind:global_window"
                    }
                  ], 
                  "is_wrapper": true
                }
              ], 
              "is_stream_like": {
                "value": true
              }
            }, 
            "output_name": "out", 
            "user_name": "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(PreFinalize.out.0).output"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s17"
        }, 
        "user_name": "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(PreFinalize.out.0)", 
        "windowing_strategy": "%0AD%22B%0A%1Dref_Coder_GlobalWindowCoder_1%12%21%0A%1F%0A%1D%0A%1Bbeam%3Acoder%3Aglobal_window%3Av1jT%0A%25%0A%23%0A%21beam%3Awindowfn%3Aglobal_windows%3Av0.1%10%01%1A%1Dref_Coder_GlobalWindowCoder_1%22%02%3A%00%28%010%018%01H%01"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_finalize_write"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {
          "side0-write/Write/WriteImpl/FinalizeWrite": {
            "@type": "OutputReference", 
            "output_name": "out", 
            "step_name": "SideInput-s18"
          }, 
          "side1-write/Write/WriteImpl/FinalizeWrite": {
            "@type": "OutputReference", 
            "output_name": "out", 
            "step_name": "SideInput-s19"
          }, 
          "side2-write/Write/WriteImpl/FinalizeWrite": {
            "@type": "OutputReference", 
            "output_name": "out", 
            "step_name": "SideInput-s20"
          }
        }, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "write/Write/WriteImpl/FinalizeWrite.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s7"
        }, 
        "serialized_fn": "<string of 2364 bytes>", 
        "user_name": "write/Write/WriteImpl/FinalizeWrite/FinalizeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2019-05-02T00:20:43.981132Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-05-01_17_20_42-17136829585966650984'
 location: u'us-central1'
 name: u'beamapp-jenkins-0502002041-115329'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-05-02T00:20:43.981132Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-05-01_17_20_42-17136829585966650984]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-01_17_20_42-17136829585966650984?project=apache-beam-testing
root: INFO: Job 2019-05-01_17_20_42-17136829585966650984 is in state JOB_STATE_RUNNING
root: INFO: 2019-05-02T00:20:42.692Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-05-01_17_20_42-17136829585966650984. The number of workers will be between 1 and 1000.
root: INFO: 2019-05-02T00:20:42.867Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-05-01_17_20_42-17136829585966650984.
root: INFO: 2019-05-02T00:20:46.019Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-05-02T00:20:46.441Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-05-02T00:20:47.091Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-05-02T00:20:47.145Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-05-02T00:20:47.236Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
root: INFO: 2019-05-02T00:20:47.288Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-05-02T00:20:47.343Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-05-02T00:20:47.548Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-05-02T00:20:47.594Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-05-02T00:20:47.638Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Write into group/Reify
root: INFO: 2019-05-02T00:20:47.686Z: JOB_MESSAGE_DETAILED: Fusing consumer group/GroupByWindow into group/Read
root: INFO: 2019-05-02T00:20:47.734Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/GroupByWindow into write/Write/WriteImpl/GroupByKey/Read
root: INFO: 2019-05-02T00:20:47.773Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Write into write/Write/WriteImpl/GroupByKey/Reify
root: INFO: 2019-05-02T00:20:47.823Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Extract into write/Write/WriteImpl/GroupByKey/GroupByWindow
root: INFO: 2019-05-02T00:20:47.866Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WindowInto(WindowIntoFn) into write/Write/WriteImpl/Pair
root: INFO: 2019-05-02T00:20:47.922Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Reify into write/Write/WriteImpl/WindowInto(WindowIntoFn)
root: INFO: 2019-05-02T00:20:48.007Z: JOB_MESSAGE_DETAILED: Fusing consumer split into read/Read
root: INFO: 2019-05-02T00:20:48.061Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Reify into pair_with_one
root: INFO: 2019-05-02T00:20:48.107Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles/WriteBundles
root: INFO: 2019-05-02T00:20:48.146Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WriteBundles/WriteBundles into format
root: INFO: 2019-05-02T00:20:48.185Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
root: INFO: 2019-05-02T00:20:48.242Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/GroupByWindow
root: INFO: 2019-05-02T00:20:48.280Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
root: INFO: 2019-05-02T00:20:48.316Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/InitializeWrite into write/Write/WriteImpl/DoOnce/Read
root: INFO: 2019-05-02T00:20:48.363Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-05-02T00:20:48.405Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-05-02T00:20:48.439Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-05-02T00:20:48.487Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-05-02T00:20:48.695Z: JOB_MESSAGE_DEBUG: Executing wait step start26
root: INFO: 2019-05-02T00:20:48.793Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
root: INFO: 2019-05-02T00:20:48.835Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Create
root: INFO: 2019-05-02T00:20:48.847Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-05-02T00:20:48.877Z: JOB_MESSAGE_BASIC: Executing operation group/Create
root: INFO: 2019-05-02T00:20:48.902Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-05-02T00:20:49.024Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/GroupByKey/Session" materialized.
root: INFO: 2019-05-02T00:20:49.070Z: JOB_MESSAGE_DEBUG: Value "group/Session" materialized.
root: INFO: 2019-05-02T00:20:49.166Z: JOB_MESSAGE_BASIC: Executing operation read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-05-02T00:22:37.783Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-05-02T00:23:20.921Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-05-02T00:23:20.956Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
--------------------- >> end captured logging << ---------------------
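
Editor's note: the job-graph JSON and the fusion messages in the captured logging above describe a wordcount-style pipeline (user steps read, split, pair_with_one, group, count, format, write). For orientation only, a minimal sketch of a pipeline with that shape, assuming the standard Beam Python wordcount structure rather than the benchmark's exact test code, is:

    # Minimal sketch only: the step labels mirror the fused stages named in the
    # log above; run(), input_path and output_path are illustrative and not the
    # benchmark's actual entry point.
    import re

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def run(input_path, output_path, pipeline_args=None):
        with beam.Pipeline(options=PipelineOptions(pipeline_args or [])) as p:
            counts = (
                p
                | 'read' >> beam.io.ReadFromText(input_path)
                | 'split' >> beam.FlatMap(lambda line: re.findall(r"[\w']+", line))
                | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
                | 'group' >> beam.GroupByKey()
                | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
                | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
            )
            counts | 'write' >> beam.io.WriteToText(output_path)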

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PerformanceTests_Python/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 557.374s

FAILED (failures=1)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-sdks-python:integrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10m 6s

2019-05-02 00:29:59,443 f47897e4 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
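
Editor's note: the traceback shows how the failure surfaces. SubmitJob in gcp_dpb_dataflow.py asserts that the return code of the submission command is zero, so the non-zero exit from ':beam-sdks-python:integrationTest' above is re-raised as "AssertionError: Integration Test Failed." A minimal sketch of that pattern, where only the assert is quoted from the traceback and the command handling around it is assumed:

    # Sketch of the retcode check seen in the traceback; submit_cmd and the
    # subprocess handling are illustrative, only the assert comes from the log.
    import subprocess

    def SubmitJob(submit_cmd):
        # Launch the job-submission command and wait for it to finish.
        proc = subprocess.Popen(submit_cmd,
                                stdout=subprocess.PIPE, stderr=subprocess.PIPE)
        stdout, stderr = proc.communicate()
        retcode = proc.returncode
        # A non-zero exit from the integration test run becomes an AssertionError.
        assert retcode == 0, "Integration Test Failed."
        return stdout, stderr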
2019-05-02 00:29:59,444 f47897e4 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-05-02 00:29:59,446 f47897e4 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-02 00:29:59,447 f47897e4 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-05-02 00:29:59,447 f47897e4 MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-05-02 00:29:59,447 f47897e4 MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python/ws/runs/f47897e4/pkb.log>
2019-05-02 00:29:59,447 f47897e4 MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python/ws/runs/f47897e4/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PerformanceTests_Python #2359

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Python/2359/display/redirect?page=changes>

