Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/05/15 00:51:02 UTC

Build failed in Jenkins: beam_PerformanceTests_WordCountIT_Py36 #1423

See <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/1423/display/redirect?page=changes>

Changes:

[pabloem] [BEAM-9967] Adding support for BQ labels on Query/Export jobs. (Roll

[kcweaver] [BEAM-9993] Add option defaults for Flink Python tests.

[github] [BEAM-9941] Added a test of a GBK followed by a Flatten with an unknown

[github] Clarify pubsub IO comment about timestamps (#11672)

[iemejia] [BEAM-9833] Add yamllint config

[iemejia] [BEAM-9833] Fix .asf.yaml issues, sort labels and disable rebase button

[github] Clarifies an error message in Katas to explain what is actually wrong.

[github] Merge pull request #11210 from [BEAM-8949] SpannerIO integration tests

[github] [BEAM-9876] Migrate the Beam website from Jekyll to Hugo to enable

[github] Update the range for pyarrow to qualify pyarrow 0.17.x (#11699)


------------------------------------------
[...truncated 152.51 KB...]
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_5"
                    }
                  ],
                  "is_pair_like": true,
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_5"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "write/Write/WriteImpl/FinalizeWrite.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s7"
        },
        "serialized_fn": "eNrNWHd8HEcVvjvJbe04LsRgUriYGFYhdyfH2HFMSEjOlmMOn5WVEi/FbPZ2525W2nJvZlaygjaJ46hgAgQIoZleQu8Qeu8loWM6hE7oNVTnzeyd5BOy8H/mJ2l37+28b+e977337enGLt2xm7ZDiVUjdlAUzA55PWIBLzoRI1rZ9n275pN9zG42CdsR9YUaZHoOQjaBnG52ZTIZqx5CVweIF+FvzeZEs+peaPvedcQaZZ4gGnSbK5RLjLCswZvEgUVVcwnamixyCOewmHaby+UaMdYkFvVCwWGJ43q+X7TkUbMcRmxBrNAOiCvipo+wS5XL7r2D6HOlclmmLF7YjIVC4qCpR0exmDUtNxdLE/MaXggrpuA0M4+fO/KB69Qeii7BhNgiYhxWTsHpBqyqVvfD6ilYY8DauLYfHqDPTYIgB4QXadYgnge8cFiDMzBz6xJ4oG6uwgfVPZ9YTVtQq8lI3TsAD+pAiJroHfLiiO3HuI5FI55LmDYgbOE510hjf9sG6xH4wQmcqZsaAqcecu9wVkfq/Mh2lV2DsxV5XDA4ZwIeYkDeXNT2hHPNvXjd4NtLJUGCZoFj2HaDFLAwCiR0CyJKT4QLXmqOFTxRcPwodktpdkubtmy7eEvv5t6Ltm7r3VZihMc+UrIhrs2GLdmzeFyXYT+Uyu2fl8BGnZ5F8/RcE1dl4GG4XkYTxoHFqc1cDg+vZNQ2nQijhs6EKxtvnbTB6Iox3F85UunpQfzzE3hEzVwtg5Zg6Q5kqdsCLjDPQ3tho966FQc9vVvcQlRH0+zjpQ0K5rpOiIYf1do4RXWzcP68niUVvRMFyDbnSG3KUK/ZLYsuFhFsMpfhZeAFLfIuVPHLMio1fRtrdLO5XraG7MUQiWS253thwwrJKJ6xoB85rXrJosSWCdpSjWudRRkKwrAji7HwfO1y1ogDEop+33YIjXyVqq2YqouoPG6rZNX5YpLA9v3wKN3cKvt0TFDcOse6s1R79RZUb5f2zR53B02/1NdqfWWBSzr20VR1pllXh03PGfaJO4BwuyWaBo9O4FJd5cS1hQ2Xzec4s3wHLtHgMbjLyxO4QjdXylQ6cpDIxpLBQtk8E43Se/vsprfjnpgcbNtHNsEOVRWjXuhGo1aAyZU5xam280SjUeIoGK4p+m3f+i9vDfoUFyMeGZVgu+abYfU4dGSTa3ClTs82l+J6WbFyjMHuCXisARW9kq1k8K+rsrasTWfGM9PZQ7mBDDyuOgF7epRLOxZA016ToaVEo4CUhkg47IW8fS5w3x4hpdGIDXOMi5RkWFY/Yap4Q4cMyn629kXMLUdxKHYPWv1jm7eWOHNK3B2WnS6pLx2XlFLKR7E5Bv1qK5f4dlBz7Uvhqj2Hs+UMGOYZsuNZFFgMIWVlz+x2QA1klZ/WqIPBSbi6R8A1BuzrSFeDCAvpxPI01WNqsecLDAker3KMt+VdeMIkPNGAJ3W4ekEzYsIKIjeWSrHfXDNnxKchwJMnwDLgWgVvoa8jLAvsSagZ4NCrqvOx5xD8AC5F7pCmHNLUVVlSKZevFRmRHcqNZ1yk6lA2yY3n+Ibx7FCXmxPdhzKSPrFoHO+4XcM5tnU853avz6BtsbsotYslqa111SWv6l3r8HhT1sWVQPRqJadCd0ndxukKdVkm5vVo6S9Hvk9UWeWjep5jC+Y3uvlRT9B8gJqeF9TGOyHJE5/I9s+nDUPcvM3zNjqEDZ8I9JbUFPN9HuMiL0aj9nqeJ6EjC4Qw6YOIGzbyDReoYxEaAmjavr7HBXhqpEnORRT5HIaU5Hrc9xwCw0qDkD3wVefuDJpibKa3IVC3fRJCqAah0rydjEUMInqOgKaZU9gAKhXtImLq6XIEAp8CQfvVULbmL8J4z33Z8qpMblW2O7syuyK7NJvL5nIw0oNlOGrAAerQuFpNYKwq4DoDnjIB4wYk1E3gev0EY+MG2necSN24gEjtU77EVYG1pOogzrKbEjikm2slunrvcq1U0VPEmxdA7LO56Gde4AlvZEb9JhByMoEp83SVB3ylq9nOcAttegG0fjWcWyhPRZTDCTytxmsKSOaRCztotoBuWUiN22tbWE9HrGck8MyaGgGtFKYwty4As0vN2jRrLaRnIdKzE3iOKqs0TXDbieZ2+qAOGA2eiwi3J/A8VTUW6io8v8NfSiUvzkQ7G4sGL0DXFybwIvXwwHNYxOHI4aVHj+j33Hvs2A28Jn/iWjwBLzbgJZPw0gRehnX0cgNeEdfoTtpHdSr5fmUCr9LpzVQS9eoEXkOnqcz2HQm8FgHoLVTm63UJvL5Gb6Uy4jck8EZ6G71doU/Bm+ZX5k0npcxvpqi4b9HpZVTK6FsTeJtO/7dmvl1tf1dr9qUSla0MlLvH5Xx6ByrRO3soHt91qvTo3Z16dOeeDTlqyK5+jwHvxa6+U3b1+5CN9xvwgQ42PpjAh9psfDiBjyAbd8yw8NEEPtZm4eMJfOI4Fj45PwsXnhQLn5IsfLrNwmcS+OzJsPC5eViY+6LweWThC4qLL54qLu7q5OJufDegBh2gKPdfQka+bMBXkJG7q/RUaOlXpZbS/xv9/JqAr+vUo0N0mPo0oCFVWvcNCpRR1LNv0rsW0rOj8+vZt2Tlf9uA72Cej8rK/y5W/vcM+P4E/MCAH0o9+9GJ9OyeDj37seqRnyTw03aP/CyBn7cm1i8S+OXsxPpVAve2e+XXCfxmplcm4LcG/G4Sfp/AH3AffzTgTx0d+OcE/tJG/2sCf+vowPsS+Hsb9R8J/PO4DvwXWfCfGenEx6g0+Dc6/yeBY7qqzDTmesgyWQzz5MWDayyLHj0HWQ5P6k1HMK/RwLYJWdcCWK1V2o707W2w9ZF1p2iLJNrq9KtMHMS+LYtRvsISthhvVbLqmyTquHx5sPA2wS9/nC2Z+8S50rUjZgpKY0vTBy3DU8I0PNIjlQwK+5o5wh7U8HslY8vTp6r/y3jcar10shVono5rgp2GF8X7Aa3iYHk=",
        "user_name": "write/Write/WriteImpl/FinalizeWrite/FinalizeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: '2020-05-15T00:38:01.012666Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2020-05-14_17_37_59-16746012563062290028'
 location: 'us-central1'
 name: 'beamapp-jenkins-0515003758-099022'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2020-05-15T00:38:01.012666Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-05-14_17_37_59-16746012563062290028]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2020-05-14_17_37_59-16746012563062290028
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-14_17_37_59-16746012563062290028?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-05-14_17_37_59-16746012563062290028 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:03.403Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:03.814Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:04.467Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:04.502Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:04.538Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:04.579Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:04.610Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:04.716Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:04.766Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:04.799Z: JOB_MESSAGE_DETAILED: Fusing consumer split into read/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:04.837Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:04.874Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Reify into pair_with_one
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:04.907Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Write into group/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:04.941Z: JOB_MESSAGE_DETAILED: Fusing consumer group/GroupByWindow into group/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.028Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.064Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.098Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WindowInto(WindowIntoFn) into format
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.136Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WriteBundles/WriteBundles into write/Write/WriteImpl/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.173Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles/WriteBundles
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.208Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Reify into write/Write/WriteImpl/Pair
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.235Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Write into write/Write/WriteImpl/GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.265Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/GroupByWindow into write/Write/WriteImpl/GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.298Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Extract into write/Write/WriteImpl/GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.329Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/InitializeWrite into write/Write/WriteImpl/DoOnce/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.369Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.402Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.439Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.466Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.609Z: JOB_MESSAGE_DEBUG: Executing wait step start26
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.803Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.827Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.839Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.861Z: JOB_MESSAGE_BASIC: Executing operation group/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.875Z: JOB_MESSAGE_BASIC: Starting 10 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.934Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.934Z: JOB_MESSAGE_BASIC: Finished operation group/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:05.991Z: JOB_MESSAGE_DEBUG: Value "group/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:06.030Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:06.066Z: JOB_MESSAGE_BASIC: Executing operation read/Read+split+pair_with_one+group/Reify+group/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:16.252Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:31.835Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 9 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:31.874Z: JOB_MESSAGE_DETAILED: Resized worker pool to 9, though goal was 10.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:38:37.260Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 10 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:40:02.355Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:40:02.392Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:43:11.324Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:43:11.411Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Read.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:43:11.437Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/InitializeWrite.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:43:11.509Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:43:11.548Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:43:11.550Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:43:11.574Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:43:11.589Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:43:11.619Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(InitializeWrite.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:43:11.622Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:43:11.653Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:43:11.678Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(InitializeWrite.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:44:05.588Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:45:13.677Z: JOB_MESSAGE_BASIC: Finished operation read/Read+split+pair_with_one+group/Reify+group/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:45:13.753Z: JOB_MESSAGE_BASIC: Executing operation group/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:45:13.810Z: JOB_MESSAGE_BASIC: Finished operation group/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:45:13.878Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:06.455Z: JOB_MESSAGE_BASIC: Finished operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:06.544Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:06.597Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:06.670Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Read+write/Write/WriteImpl/GroupByKey/GroupByWindow+write/Write/WriteImpl/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:09.100Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/GroupByKey/Read+write/Write/WriteImpl/GroupByKey/GroupByWindow+write/Write/WriteImpl/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:09.213Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/Extract.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:09.283Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:09.320Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(Extract.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:09.322Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:09.361Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(Extract.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:09.397Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:09.436Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(Extract.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:09.510Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/PreFinalize
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:11.956Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/PreFinalize
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:12.036Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:12.103Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(PreFinalize.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:12.155Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(PreFinalize.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:12.233Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(PreFinalize.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:12.318Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/FinalizeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:14.882Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/FinalizeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:14.948Z: JOB_MESSAGE_DEBUG: Executing success step success24
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:15.060Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:15.172Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:48:15.205Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:50:29.418Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 10 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:50:29.464Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-15T00:50:29.502Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-05-14_17_37_59-16746012563062290028 is in state JOB_STATE_DONE
DEBUG:apache_beam.io.filesystem:Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results'
DEBUG:apache_beam.io.filesystem:translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results*-of-*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1589503076808\\/results[^/\\\\]*\\-of\\-[^/\\\\]*'
INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
INFO:apache_beam.io.gcp.gcsio:Finished listing 30 files in 0.07508325576782227 seconds.
INFO:apache_beam.testing.pipeline_verifiers:Find 30 files in gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results*-of-*: 
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00000-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00001-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00002-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00003-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00004-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00005-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00006-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00007-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00008-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00009-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00010-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00011-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00012-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00013-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00014-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00015-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00016-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00017-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00018-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00019-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00020-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00021-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00022-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00023-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00024-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00025-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00026-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00027-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00028-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1589503076808/results-00029-of-00030
Terminated
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=cc37cfed-5644-4339-8736-44c720899615, currentDir=<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 23583
  log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-23583.out.log
----- Last  20 lines from daemon log file - daemon-23583.out.log -----
	at org.gradle.process.internal.DefaultExecHandle.execExceptionFor(DefaultExecHandle.java:232)
	at org.gradle.process.internal.DefaultExecHandle.setEndStateInfo(DefaultExecHandle.java:209)
	at org.gradle.process.internal.DefaultExecHandle.failed(DefaultExecHandle.java:356)
	at org.gradle.process.internal.ExecHandleRunner.run(ExecHandleRunner.java:86)
	at org.gradle.internal.operations.CurrentBuildOperationPreservingRunnable.run(CurrentBuildOperationPreservingRunnable.java:42)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IllegalStateException: Shutdown in progress
	at java.lang.ApplicationShutdownHooks.remove(ApplicationShutdownHooks.java:82)
	at java.lang.Runtime.removeShutdownHook(Runtime.java:239)
	at org.gradle.process.internal.shutdown.ShutdownHooks.removeShutdownHook(ShutdownHooks.java:33)
	at org.gradle.process.internal.DefaultExecHandle.setEndStateInfo(DefaultExecHandle.java:199)
	at org.gradle.process.internal.DefaultExecHandle.aborted(DefaultExecHandle.java:352)
	at org.gradle.process.internal.ExecHandleRunner.completed(ExecHandleRunner.java:107)
	at org.gradle.process.internal.ExecHandleRunner.run(ExecHandleRunner.java:83)
	... 7 more
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

2020-05-15 00:51:01,750 f72351ea MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 846, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 689, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 161, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 96, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2020-05-15 00:51:01,752 f72351ea MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2020-05-15 00:51:01,754 f72351ea MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 995, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 846, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 689, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 161, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 96, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2020-05-15 00:51:01,755 f72351ea MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2020-05-15 00:51:01,755 f72351ea MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2020-05-15 00:51:01,755 f72351ea MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/runs/f72351ea/pkb.log>
2020-05-15 00:51:01,756 f72351ea MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/runs/f72351ea/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PerformanceTests_WordCountIT_Py36 #1424

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/1424/display/redirect?page=changes>

