Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/04/25 01:11:44 UTC

Build failed in Jenkins: beam_PerformanceTests_WordCountIT_Py36 #1344

See <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/1344/display/redirect>

Changes:


------------------------------------------
[...truncated 153.84 KB...]
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "write/Write/WriteImpl/FinalizeWrite.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s7"
        },
        "serialized_fn": "eNrNWHd8HEcVvjvJbe04LsRgUriYGFYhd+eC7WBCQiJbjjl8VlZKvBSz2dudu1lr25uZlaygTeI4KpgAAUJoppfQO4Tee0nomA6hE3oN1byZvZN8Qhb+z/wk7e69nfftvPe997493dilO3ZsO5RYdWIHZcHskDciFvCyEzGi9dq+b9d9so/ZcUzYjqgv1CDXcwjyKRR0syuXy1mNELo6QLwIf+s2J5rV8ELb964j1gjzBNGg21ymXBKEZU0eEwcW1MxFaItZ5BDOYSHtNpfKNWI0Jhb1QsFhkeN6vl+25FGzHEZsQazQDogrkthH2MXKZffeQfS5UrksURYvjBOhkDho6tFRImZMS82F0sS8phfCskk4wyzi54584Dq1h7JLMCG2iBiH5ZNwpgErarX9sHISVhmwOqnvh/vps5MgyEHhRZo1iOcBLxzS4CzM3JoU7q+bK/BBDc8nVmwLasWMNLyD8IAOhChG75CXh20/wXUsGvZcwrQBYQvPuUYa+9s2WIvAD0zhbN3UEDjzkHuHczpS50e2q+wanKvI44LBeePwIAOK5oK2J5xv7sXrJt9eqQgSxCWOYdtNUsLCKJHQLYkoOxEueCUeLXmi5PhR4lay7FY2brl427ZtWzdt3rZpy8YKIzzxkZJ1SX0mbMmexZOGDPvBVG7/ghTW6/QcWqTnm7gqBw/B9TKaMAksTm3mcnhoNae26UQYNXQmXNl466QNRleM4v56I5WeHsS/MIWH1c2VMmgJlu1Alrot4CLzArSX1uutW0nQs2GLW4oaaJp5vLRByVzTCdH0o3obp6xuli6c07OioneiANnmHKnNGNpgdsuiS0QEG80leBl4QYu8TSp+WUaV2LexRjeba2VryF4MkUhme74XNq2QjOAZC/rhU6qXLEpsmaAttaTeWZShIAw7spwIz9cuZ80kIKHo922H0MhXqdqKqdpG5fHial6dH0FS2L4fHqmbW2WfjgqKW+dYd5Zqrw0l1duVfTPH3UHsV/para8scEnHPmJVZ5p1dRh7zpBP3AGE2y3RNHhUCpfqKieuLWy4bC7H6eU7cIkGj8ZdXp7CFbq5XKbSkYNENpYMFnrNs9EovbfPbHo77onJwbZ9eCPsUFUx4oVuNGIFmFyZU5xqO082GiWOguGaot/2rf/y1qBPcTHskREJtmuuGdZIQkc2uQZX6vRcczGulxUrxxjsHofHGFDVq/lqDv+6qqt7tancWG4qf7gwkIPH1sZhT49yaccCaNprMrRUaBSQygESDnkhb59L3LeHSWUkYkMc4yIVGZbVT5gq3tAhg7KfrX0Rc3ujJBS7B63+0c1bK5w5Fe4OyU6X1FdOSEol46Mcj0K/2solvh3UXftSuGrPkXxvDgzzLNnxLAoshpCysqd3O6AGsspPa9TB4ARc3SPgGgP2daSrSYSFdGJ5muox9cTzBYYEj1M5xtvyLjx+Ap5gwBM7XL0gjpiwgshNpFLsN1fNGvFZCPCkcbAMuFbBW+jrCMsCewLqBjj0qtpc7DkEP4BLkTukqYA0dVUXVXt7rxU5kT9QGMu5SNXhfFoYK/B1Y/kDXW5BdB/OSfrEgjG843YNFdjWsYLbvTaHtoXugswuFmW21lWXvGp0rcHjTXkXVwLRa9WCCt0lDRunKzRkmZjXo6W/N/J9osqqGDWKHFuwuN4tjniCFgPU9KKgNt4JSZH4RLZ/MWsY4hZtXrTRIWz6RKC3pKZc7PMYF0UxErXX8yIJHVkghEkfRFy3nq+7SB3L0BRAs/b1PS7AUyNNci6iyOdwQEmux33PITCkNAjZA1917s4gFqPTvQ2Buu2TEEI1CJXm7WQsYhDR8wTEZkFhA6hUtIuIqafLEQh8EgTtV0PZmrsIkz335XtX5Aor8t355fll+cX5Qr5QgOEeLMMRAw5Shya1WgqjNQHXGfDkcRgzIKVuCtfrJxkbN9C+E0TqxnlEap/yJa4KrC
VVh3CW3ZTCYd1cLdHVe5drZYqeId48D2KfzUU/8wJPeMPT6jeOkBMpTJpnqjzgK13ddoZaaFPzoPWr4dxCeQqiHEnhqXVeV0Ayj1zYQdwCumU+NW6vbWE9DbGensIz6moEtFKYwdw6D8wuNWuzrLWQnolIz0rh2aqssjTBbSeb29mDOmA0eA4i3J7Cc1XVWKir8LwOfymVvDwd7UwsGjwfXV+QwgvVwwPPYRGHo0cWHzuq33Pv8eM38Lr8SerJOLzIgBdPwEtSeCnW0csMeHlSpztpH9Wp5PsVKbxSpzdTSdSrUng1naIy23ek8BoEoLdQma/XpvC6Or2Vyohfn8Ib6G30doU+CW+cW5k3npIyv4mi4r5Zp5dRKaNvSeGtOv3fmvk2tf1drdmXSVS+OtDbPSbn09tRid7RQ/H4ztOlR+/q1KM796wrUEN29bsNeA929Z2yq9+LbLzPgPd3sPGBFD7YZuNDKXwY2bhjmoWPpPDRNgsfS+HjJ7DwiblZ2HRKLHxSsvCpNgufTuEzp8LCZ+dgYfaLwueQhc8rLr5wuri4q5OLu/HdgBp0gKLcfxEZ+ZIBX0ZG7q7R06GlX5FaSv9v9POrAr6mU48eoEPUpwENqdK6r1OgjKKefYPeNZ+eHZtbz74pK/9bBnwb83xMVv53sPK/a8D3xuH7BvxA6tkPT6Zn93To2Y9Uj/w4hZ+0e+SnKfysNbF+nsIvZibWL1O4t90rv0rh19O9Mg6/MeC3E/C7FH6P+/iDAX/s6MA/pfDnNvpfUvhrRwfel8Lf2qh/T+EfJ3TgP8m8/8zIJj5GpcG/0PnfKRzXVWVmMTdClstjmKcuHlxjefToOcQKeFJvOoJ5zSa2Tci65sFqrdJ2ZG9vg62PrDtDWyDRVmZfZZIg8W1ZjPIVlrCFeKuaV98kUcfly4OFtwl++eNs0ewnzpauHQlTUBpbnD1oCZ5SpuGRHq3mUNhXzRL2oI7fKxlbmj1V/V/G41brpZMtQ/NUUhfsDLwo/we+A2B6",
        "user_name": "write/Write/WriteImpl/FinalizeWrite/FinalizeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: '2020-04-25T00:57:21.527304Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2020-04-24_17_57_20-522771475528602113'
 location: 'us-central1'
 name: 'beamapp-jenkins-0425005718-696933'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2020-04-25T00:57:21.527304Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-04-24_17_57_20-522771475528602113]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2020-04-24_17_57_20-522771475528602113
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-24_17_57_20-522771475528602113?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-04-24_17_57_20-522771475528602113 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:23.475Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:23.930Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:24.591Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:24.623Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:24.651Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:24.682Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:24.710Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:24.783Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:24.824Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:24.854Z: JOB_MESSAGE_DETAILED: Fusing consumer split into read/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:24.882Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:24.912Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Reify into pair_with_one
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:24.936Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Write into group/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:24.966Z: JOB_MESSAGE_DETAILED: Fusing consumer group/GroupByWindow into group/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:24.992Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.015Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.043Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WindowInto(WindowIntoFn) into format
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.071Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WriteBundles/WriteBundles into write/Write/WriteImpl/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.099Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles/WriteBundles
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.126Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Reify into write/Write/WriteImpl/Pair
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.154Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Write into write/Write/WriteImpl/GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.184Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/GroupByWindow into write/Write/WriteImpl/GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.210Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Extract into write/Write/WriteImpl/GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.232Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/InitializeWrite into write/Write/WriteImpl/DoOnce/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.259Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.300Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.339Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.376Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.562Z: JOB_MESSAGE_DEBUG: Executing wait step start26
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.618Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.644Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.660Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.671Z: JOB_MESSAGE_BASIC: Executing operation group/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.682Z: JOB_MESSAGE_BASIC: Starting 10 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.726Z: JOB_MESSAGE_BASIC: Finished operation group/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.726Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.783Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.815Z: JOB_MESSAGE_DEBUG: Value "group/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:25.875Z: JOB_MESSAGE_BASIC: Executing operation read/Read+split+pair_with_one+group/Reify+group/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:51.099Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:55.810Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 6 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:57:55.840Z: JOB_MESSAGE_DETAILED: Resized worker pool to 6, though goal was 10.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:58:01.314Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 9 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:58:01.353Z: JOB_MESSAGE_DETAILED: Resized worker pool to 9, though goal was 10.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:58:33.647Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 10 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:59:27.030Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T00:59:27.062Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:02:44.170Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:02:44.247Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Read.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:02:44.283Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/InitializeWrite.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:02:44.356Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:02:44.391Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:02:44.412Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:02:44.431Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:02:44.456Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:02:44.481Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:02:44.487Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(InitializeWrite.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:02:44.528Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:02:44.560Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(InitializeWrite.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:03:25.523Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:04:53.492Z: JOB_MESSAGE_BASIC: Finished operation read/Read+split+pair_with_one+group/Reify+group/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:04:53.589Z: JOB_MESSAGE_BASIC: Executing operation group/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:04:53.633Z: JOB_MESSAGE_BASIC: Finished operation group/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:04:53.696Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:44.535Z: JOB_MESSAGE_BASIC: Finished operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:44.609Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:44.657Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:44.725Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Read+write/Write/WriteImpl/GroupByKey/GroupByWindow+write/Write/WriteImpl/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:48.047Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/GroupByKey/Read+write/Write/WriteImpl/GroupByKey/GroupByWindow+write/Write/WriteImpl/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:48.141Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/Extract.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:48.211Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:48.253Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(Extract.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:48.283Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:48.302Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(Extract.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:48.351Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:48.381Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(Extract.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:48.452Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/PreFinalize
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:50.819Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/PreFinalize
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:50.896Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:50.952Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(PreFinalize.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:50.997Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(PreFinalize.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:51.058Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(PreFinalize.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:51.116Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/FinalizeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:53.387Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/FinalizeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:53.460Z: JOB_MESSAGE_DEBUG: Executing success step success24
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:53.574Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:53.646Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:07:53.676Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:09:35.089Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 10 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:09:35.128Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-25T01:09:35.153Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-04-24_17_57_20-522771475528602113 is in state JOB_STATE_DONE
DEBUG:apache_beam.io.filesystem:Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results'
DEBUG:apache_beam.io.filesystem:translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results*-of-*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1587776237251\\/results[^/\\\\]*\\-of\\-[^/\\\\]*'
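The translate_pattern line above shows the shard glob being compiled to a regex: every literal character is backslash-escaped and each `*` becomes `[^/\\]*`, so a wildcard only matches within a single path segment. A minimal sketch of that translation (a hypothetical helper for illustration, not Beam's actual filesystem implementation):

```python
import re

def translate_glob(pattern: str) -> str:
    """Hypothetical glob->regex translation mirroring the logged output:
    escape every literal character; '*' matches any run of characters
    that is not a path separator."""
    out = []
    for ch in pattern:
        if ch == '*':
            out.append(r'[^/\\]*')      # wildcard stops at / and \
        elif ch.isalnum() or ch == '_':
            out.append(ch)              # safe literal, no escaping needed
        else:
            out.append('\\' + ch)       # escape :, /, -, ., etc.
    return ''.join(out)

# A shard name produced by the write transform matches the compiled glob.
regex = translate_glob('gs://bucket/output/results*-of-*')
assert re.match(regex, 'gs://bucket/output/results-00000-of-00030')
```

Because `*` compiles to `[^/\\]*` rather than `.*`, a pattern like `results*` cannot accidentally match files in nested "directories" of the object store.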
INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
INFO:apache_beam.io.gcp.gcsio:Finished listing 30 files in 0.08020639419555664 seconds.
INFO:apache_beam.testing.pipeline_verifiers:Find 30 files in gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results*-of-*: 
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00000-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00001-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00002-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00003-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00004-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00005-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00006-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00007-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00008-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00009-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00010-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00011-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00012-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00013-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00014-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00015-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00016-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00017-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00018-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00019-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00020-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00021-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00022-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00023-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00024-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00025-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00026-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00027-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00028-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00029-of-00030
INFO:apache_beam.testing.pipeline_verifiers:Read from given path gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results*-of-*, 26186927 lines, checksum: ea0ca2e5ee4ea5f218790f28d0b9fe7d09d8d710.
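The verifier above reduces all 30 shards to a single sha1 checksum. One way such an order-independent digest can be computed (a hypothetical sketch; the actual `pipeline_verifiers` implementation may differ) is to sort the lines before hashing, so the digest does not depend on which shard each line landed in:

```python
import hashlib

def checksum_lines(lines):
    """Hypothetical order-independent digest over output lines:
    sort first, then sha1 the newline-joined content (40 hex chars)."""
    ordered = sorted(lines)
    return hashlib.sha1('\n'.join(ordered).encode('utf-8')).hexdigest()

# Shard order does not change the digest.
a = checksum_lines(['walk: 3', 'the: 10'])
b = checksum_lines(['the: 10', 'walk: 3'])
assert a == b and len(a) == 40
```

This is why the same WordCount input reliably reproduces the checksum `ea0ca2e5...` across runs even though Dataflow writes the shards in nondeterministic order.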
INFO:root:average word length: 19
DEBUG:apache_beam.io.filesystem:Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results'
DEBUG:apache_beam.io.filesystem:translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1587776237251\\/results[^/\\\\]*'
INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
INFO:apache_beam.io.gcp.gcsio:Finished listing 30 files in 0.06592822074890137 seconds.
error: [Errno 111] Connection refused
<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/src/sdks/python/scripts/run_integration_test.sh>: line 278: 18203 Terminated              python setup.py nosetests --test-pipeline-options="$PIPELINE_OPTS" --with-xunitmp --xunitmp-file=$XUNIT_FILE --ignore-files '.*py3\d?\.py$' $TEST_OPTS
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=f2204858-b39d-4e77-90f4-e0af06ac9f38, currentDir=<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 30284
  log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-30284.out.log
----- Last  20 lines from daemon log file - daemon-30284.out.log -----
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00021-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00022-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00023-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00024-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00025-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00026-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00027-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00028-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results-00029-of-00030
INFO:apache_beam.testing.pipeline_verifiers:Read from given path gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results*-of-*, 26186927 lines, checksum: ea0ca2e5ee4ea5f218790f28d0b9fe7d09d8d710.
INFO:root:average word length: 19
DEBUG:apache_beam.io.filesystem:Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results'
DEBUG:apache_beam.io.filesystem:translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1587776237251/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1587776237251\\/results[^/\\\\]*'
INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
INFO:apache_beam.io.gcp.gcsio:Finished listing 30 files in 0.06592822074890137 seconds.
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-24_17_57_20-522771475528602113?project=apache-beam-testing
error: [Errno 111] Connection refused
<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/src/sdks/python/scripts/run_integration_test.sh>: line 278: 18203 Terminated              python setup.py nosetests --test-pipeline-options="$PIPELINE_OPTS" --with-xunitmp --xunitmp-file=$XUNIT_FILE --ignore-files '.*py3\d?\.py$' $TEST_OPTS
:sdks:python:test-suites:dataflow:py36:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 14 mins 27.544 secs.
Daemon vm is shutting down... The daemon has exited normally or was terminated in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

2020-04-25 01:11:43,601 d2f2b33d MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 846, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 689, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 161, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 96, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2020-04-25 01:11:43,602 d2f2b33d MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2020-04-25 01:11:43,604 d2f2b33d MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 995, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 846, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 689, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 161, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 96, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2020-04-25 01:11:43,605 d2f2b33d MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2020-04-25 01:11:43,605 d2f2b33d MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2020-04-25 01:11:43,605 d2f2b33d MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/runs/d2f2b33d/pkb.log>
2020-04-25 01:11:43,605 d2f2b33d MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/runs/d2f2b33d/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PerformanceTests_WordCountIT_Py36 #1345

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/1345/display/redirect>

