Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/12/08 06:38:05 UTC

Build failed in Jenkins: beam_BiqQueryIO_Streaming_Performance_Test_Java #228

See <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/228/display/redirect>

Changes:


------------------------------------------
[...truncated 513.29 KB...]
        "message" : "Invalid table ID \"bqio_write_10GB_java_050c9467-84ce-4440-82ca-026431885bcc\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
        "reason" : "invalid"
      } ],
      "message" : "Invalid table ID \"bqio_write_10GB_java_050c9467-84ce-4440-82ca-026431885bcc\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
      "status" : "INVALID_ARGUMENT"
    }
    	at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
    	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
    	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:417)
    	at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1132)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:515)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:448)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.tryCreateTable(BigQueryServicesImpl.java:520)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.createTable(BigQueryServicesImpl.java:505)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:205)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
    	at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1.processElement(PrepareWrite.java:82)
    	at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:180)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV.process(BigQueryIOIT.java:243)
    	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
    	at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)
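
    Note: the 400 above is triggered by the hyphens in the generated table ID
    "bqio_write_10GB_java_050c9467-84ce-4440-82ca-026431885bcc": BigQuery table
    IDs may only contain letters, digits, and underscores. A minimal sketch of
    one way to derive a valid per-run table ID from a UUID (the class and
    helper below are illustrative, not the test's actual code):

        import java.util.UUID;

        public class TableIdExample {
          // BigQuery table IDs are limited to [A-Za-z0-9_]; map anything else
          // (here, the hyphens of a random UUID) to underscores.
          static String validTableId(String prefix, UUID runId) {
            return prefix + "_" + runId.toString().replaceAll("[^A-Za-z0-9_]", "_");
          }

          public static void main(String[] args) {
            // prints e.g. bqio_write_10GB_java_050c9467_84ce_4440_82ca_026431885bcc
            System.out.println(validTableId("bqio_write_10GB_java", UUID.randomUUID()));
          }
        }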

    Dec 08, 2019 6:36:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2019-12-08T06:36:29.808Z: java.lang.RuntimeException: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
    {
      "code" : 400,
      "errors" : [ {
        "domain" : "global",
        "message" : "Invalid table ID \"bqio_write_10GB_java_050c9467-84ce-4440-82ca-026431885bcc\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
        "reason" : "invalid"
      } ],
      "message" : "Invalid table ID \"bqio_write_10GB_java_050c9467-84ce-4440-82ca-026431885bcc\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
      "status" : "INVALID_ARGUMENT"
    }
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:208)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
    	at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
    Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
    {
      "code" : 400,
      "errors" : [ {
        "domain" : "global",
        "message" : "Invalid table ID \"bqio_write_10GB_java_050c9467-84ce-4440-82ca-026431885bcc\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
        "reason" : "invalid"
      } ],
      "message" : "Invalid table ID \"bqio_write_10GB_java_050c9467-84ce-4440-82ca-026431885bcc\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
      "status" : "INVALID_ARGUMENT"
    }
    	at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
    	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
    	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:417)
    	at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1132)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:515)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:448)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.tryCreateTable(BigQueryServicesImpl.java:520)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.createTable(BigQueryServicesImpl.java:505)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:205)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
    	at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1.processElement(PrepareWrite.java:82)
    	at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:180)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV.process(BigQueryIOIT.java:243)
    	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
    	at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)

    Dec 08, 2019 6:36:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-08T06:36:29.842Z: Finished operation Read from source+Gather time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
    Dec 08, 2019 6:36:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2019-12-08T06:36:29.946Z: Workflow failed. Causes: S02:Read from source+Gather time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
      testpipeline-jenkins-1208-12072234-8v1d-harness-t03v
          Root cause: Work item failed.,
      testpipeline-jenkins-1208-12072234-8v1d-harness-88nr
          Root cause: Work item failed.,
      testpipeline-jenkins-1208-12072234-8v1d-harness-73ng
          Root cause: Work item failed.,
      testpipeline-jenkins-1208-12072234-8v1d-harness-t03v
          Root cause: Work item failed.
    Dec 08, 2019 6:36:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-08T06:36:30.076Z: Cleaning up.
    Dec 08, 2019 6:36:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-08T06:36:30.172Z: Stopping worker pool...
    Dec 08, 2019 6:37:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-08T06:37:54.864Z: Autoscaling: Resized worker pool from 5 to 0.
    Dec 08, 2019 6:37:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-08T06:37:54.904Z: Worker pool stopped.
    Dec 08, 2019 6:38:01 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2019-12-07_22_34_03-14357764748486349957 failed with status FAILED.

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead STANDARD_OUT
    Load test results for test (ID): 050c9467-84ce-4440-82ca-026431885bcc and timestamp: 2019-12-08T06:33:57.315000000Z:
                     Metric:                    Value:
                  write_time                       0.0

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead FAILED
    java.lang.IllegalArgumentException: Writing avro formatted data is only supported for FILE_LOADS, however the method was STREAMING_INSERTS
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:216)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expand(BigQueryIO.java:2394)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expand(BigQueryIO.java:1665)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:539)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:490)
        at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWrite(BigQueryIOIT.java:159)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testAvroWrite(BigQueryIOIT.java:148)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWriteThenRead(BigQueryIOIT.java:121)
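
    Note: the precondition quoted above comes from BigQueryIO.Write.expand: an
    Avro format function can only be combined with the FILE_LOADS write method,
    never with STREAMING_INSERTS. A hedged sketch of a consistent configuration
    (testData, tableQualifier, schema, and toGenericRecord are assumed to exist
    and are illustrative; only the BigQueryIO calls are from the Beam API):

        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;

        testData.apply("Write to BQ",
            BigQueryIO.<byte[]>write()
                .to(tableQualifier)
                .withSchema(schema)
                // Avro-formatted writes are only valid with FILE_LOADS:
                .withMethod(BigQueryIO.Write.Method.FILE_LOADS)
                .withAvroFormatFunction(request -> toGenericRecord(request)));

    The alternative is to keep STREAMING_INSERTS and supply a JSON TableRow
    formatter via withFormatFunction instead of an Avro one.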

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED

1 test completed, 1 failed
Finished generating test XML results (0.026 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.042 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest>
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 4 mins 6.8 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 46s
80 actionable tasks: 52 executed, 28 from cache

Publishing build scan...
https://scans.gradle.com/s/rx6we22q65cca

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Jenkins build is back to normal : beam_BiqQueryIO_Streaming_Performance_Test_Java #238

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/238/display/redirect?page=changes>



Build failed in Jenkins: beam_BiqQueryIO_Streaming_Performance_Test_Java #237

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/237/display/redirect>

Changes:


------------------------------------------
[...truncated 211.14 KB...]
Note: Recompile with -Xlint:unchecked for details.
Created classpath snapshot for incremental compilation in 7.602 secs. 24134 duplicate classes found in classpath (see all with --debug).
Packing task ':runners:java-fn-execution:compileJava'
:runners:java-fn-execution:compileJava (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 21.224 secs.
:runners:java-fn-execution:classes (Thread[Execution worker for ':' Thread 6,5,main]) started.

> Task :runners:java-fn-execution:classes
Skipping task ':runners:java-fn-execution:classes' as it has no actions.
:runners:java-fn-execution:classes (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 0.0 secs.
:runners:java-fn-execution:jar (Thread[Execution worker for ':' Thread 6,5,main]) started.

> Task :runners:java-fn-execution:jar
Build cache key for task ':runners:java-fn-execution:jar' is 0d153cfa410c484e3905c88fb6946d24
Caching disabled for task ':runners:java-fn-execution:jar': Caching has not been enabled for the task
Task ':runners:java-fn-execution:jar' is not up-to-date because:
  No history is available.
:runners:java-fn-execution:jar (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 0.052 secs.
:runners:direct-java:compileJava (Thread[Execution worker for ':' Thread 6,5,main]) started.
:runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava (Thread[Execution worker for ':' Thread 11,5,main]) started.

> Task :runners:direct-java:compileJava
Build cache key for task ':runners:direct-java:compileJava' is afab241bcd6155c976e5163a1017e9a6
Task ':runners:direct-java:compileJava' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task ':runners:direct-java:compileJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with error-prone compiler
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Created classpath snapshot for incremental compilation in 0.074 secs.
Packing task ':runners:direct-java:compileJava'
:runners:direct-java:compileJava (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 9.901 secs.
:runners:direct-java:classes (Thread[Execution worker for ':' Thread 6,5,main]) started.

> Task :runners:direct-java:classes
Skipping task ':runners:direct-java:classes' as it has no actions.
:runners:direct-java:classes (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 0.0 secs.
:runners:direct-java:shadowJar (Thread[Execution worker for ':' Thread 6,5,main]) started.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava
file or directory '<https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/src/main/java>', not found
Build cache key for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava' is 478bacd53fd76a3c791e1cacf0b2498d
Task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
file or directory '<https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/src/main/java>', not found
Compiling with error-prone compiler

> Task :runners:direct-java:shadowJar
Build cache key for task ':runners:direct-java:shadowJar' is 14c4f2de021c3aae1fd27e4373264f4b
Caching disabled for task ':runners:direct-java:shadowJar': Caching has not been enabled for the task
Task ':runners:direct-java:shadowJar' is not up-to-date because:
  No history is available.
Custom actions are attached to task ':runners:direct-java:shadowJar'.
*******************
GRADLE SHADOW STATS

Total Jars: 6 (includes project)
Total Time: 0.79s [790ms]
Average Time/Jar: 0.1316666666667s [131.6666666667ms]
*******************
:runners:direct-java:shadowJar (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 1.095 secs.
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution worker for ':' Thread 6,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:compileTestJava
Build cache key for task ':sdks:java:io:google-cloud-platform:compileTestJava' is 847358694231005f11869d944df56cd5
Task ':sdks:java:io:google-cloud-platform:compileTestJava' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task ':sdks:java:io:google-cloud-platform:compileTestJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with error-prone compiler

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Created classpath snapshot for incremental compilation in 0.152 secs. 14 duplicate classes found in classpath (see all with --debug).
Packing task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava'
:runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 20.942 secs.
:runners:google-cloud-dataflow-java:worker:legacy-worker:classes (Thread[Execution worker for ':' Thread 11,5,main]) started.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes
Skipping task ':runners:google-cloud-dataflow-java:worker:legacy-worker:classes' as it has no actions.
:runners:google-cloud-dataflow-java:worker:legacy-worker:classes (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar (Thread[Execution worker for ':' Thread 11,5,main]) started.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar
Build cache key for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar' is 35fd32d799d61f670afc3dadb8ce0e9a
Caching disabled for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar': Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar' is not up-to-date because:
  No history is available.
Custom actions are attached to task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar'.
*******************
GRADLE SHADOW STATS

Total Jars: 16 (includes project)
Total Time: 4.048s [4048ms]
Average Time/Jar: 0.253s [253.0ms]
*******************
:runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 5.294 secs.

> Task :sdks:java:io:google-cloud-platform:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Created classpath snapshot for incremental compilation in 0.324 secs. 1478 duplicate classes found in classpath (see all with --debug).
Packing task ':sdks:java:io:google-cloud-platform:compileTestJava'
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 16.33 secs.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution worker for ':' Thread 6,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testClasses
Skipping task ':sdks:java:io:google-cloud-platform:testClasses' as it has no actions.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution worker for ':' Thread 6,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testJar
Build cache key for task ':sdks:java:io:google-cloud-platform:testJar' is fabaa17ac85356159207833587acf590
Caching disabled for task ':sdks:java:io:google-cloud-platform:testJar': Caching has not been enabled for the task
Task ':sdks:java:io:google-cloud-platform:testJar' is not up-to-date because:
  No history is available.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 0.059 secs.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker for ':' Thread 6,5,main]) started.
:sdks:java:io:bigquery-io-perf-tests:compileTestJava (Thread[Daemon worker,5,main]) started.

> Task :runners:google-cloud-dataflow-java:compileTestJava
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava' is c073dc484c62fce77f1f79b08361079d
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task ':runners:google-cloud-dataflow-java:compileTestJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with error-prone compiler

> Task :sdks:java:io:bigquery-io-perf-tests:compileTestJava
Build cache key for task ':sdks:java:io:bigquery-io-perf-tests:compileTestJava' is 57ceedb5345a650023d143474fa3dad7
Task ':sdks:java:io:bigquery-io-perf-tests:compileTestJava' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task ':sdks:java:io:bigquery-io-perf-tests:compileTestJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with error-prone compiler
Created classpath snapshot for incremental compilation in 0.081 secs. 1478 duplicate classes found in classpath (see all with --debug).
Packing task ':sdks:java:io:bigquery-io-perf-tests:compileTestJava'
:sdks:java:io:bigquery-io-perf-tests:compileTestJava (Thread[Daemon worker,5,main]) completed. Took 4.571 secs.
:sdks:java:io:bigquery-io-perf-tests:testClasses (Thread[Daemon worker,5,main]) started.

> Task :sdks:java:io:bigquery-io-perf-tests:testClasses
Skipping task ':sdks:java:io:bigquery-io-perf-tests:testClasses' as it has no actions.
:sdks:java:io:bigquery-io-perf-tests:testClasses (Thread[Daemon worker,5,main]) completed. Took 0.0 secs.

> Task :runners:google-cloud-dataflow-java:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Created classpath snapshot for incremental compilation in 0.019 secs. 1478 duplicate classes found in classpath (see all with --debug).
Packing task ':runners:google-cloud-dataflow-java:compileTestJava'
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 10.897 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker for ':' Thread 6,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testClasses
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution worker for ':' Thread 6,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testJar
Build cache key for task ':runners:google-cloud-dataflow-java:testJar' is 3bd4f1e25615c770a7a5a128bbd65836
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar': Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 0.031 secs.
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest
Build cache key for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest' is 7419e7562f579427e8e1886d2f0ba84b
Task ':sdks:java:io:bigquery-io-perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Custom actions are attached to task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
Starting process 'Gradle Test Executor 1'. Working directory: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--writeMethod=STREAMING_INSERTS","--writeFormat=JSON","--testBigQueryDataset=beam_performance","--testBigQueryTable=bqio_write_10GB_java_stream_1209150548","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=bqio_10GB_results_java_stream","--sourceOptions={\"numRecords\":\"10485760\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"1024\"}","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=${dataflowWorkerJar}"] -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/5.2.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.19.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.25/bccda40ebc8067491b32a88f49615a747d20082d/slf4j-jdk14-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead STANDARD_ERROR
    Dec 09, 2019 5:57:46 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Dec 09, 2019 5:57:46 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 181 files. Enable logging at DEBUG level to see which files will be staged.

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead FAILED
    java.lang.IllegalStateException: BigQuery table is not empty: apache-beam-testing:beam_performance.bqio_write_10GB_java_stream_1209150548.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:588)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers.verifyTableNotExistOrEmpty(BigQueryHelpers.java:511)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.validate(BigQueryIO.java:2331)
        at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:643)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:653)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600(TransformHierarchy.java:317)
        at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:251)
        at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:460)
        at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:579)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:314)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:301)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWrite(BigQueryIOIT.java:180)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testJsonWrite(BigQueryIOIT.java:143)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWriteThenRead(BigQueryIOIT.java:128)
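
    Note: with the default WRITE_EMPTY write disposition, BigQueryIO.Write.validate
    requires the destination table to be absent or empty, and the fixed name
    bqio_write_10GB_java_stream_1209150548 was evidently left behind by an
    earlier run. A hedged sketch of one workaround (rows is an assumed
    PCollection<TableRow>; the disposition constants are from the Beam API):

        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition;

        // Append instead of demanding an empty table; alternatively, delete
        // the stale table before the run, e.g.
        //   bq rm -f -t apache-beam-testing:beam_performance.bqio_write_10GB_java_stream_1209150548
        rows.apply("Write to BQ",
            BigQueryIO.writeTableRows()
                .to("apache-beam-testing:beam_performance.bqio_write_10GB_java_stream_1209150548")
                .withWriteDisposition(WriteDisposition.WRITE_APPEND));

    For a write-throughput benchmark, deleting the leftover table (or using a
    fresh per-run table name) is the cleaner fix, since appending would measure
    writes against a non-empty destination.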

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED

1 test completed, 1 failed
Finished generating test XML results (0.024 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.041 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest>
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 7.763 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 48s
80 actionable tasks: 75 executed, 5 from cache

Publishing build scan...
https://scans.gradle.com/s/tp7u3rmk45uvq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_BiqQueryIO_Streaming_Performance_Test_Java #236

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/236/display/redirect>

Changes:


------------------------------------------
[...truncated 265.66 KB...]
    SEVERE: 2019-12-09T17:18:26.002Z: An OutOfMemoryException occurred. Consider specifying higher memory instances in PipelineOptions.
    java.lang.RuntimeException: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.ExecutionError: java.lang.OutOfMemoryError: Java heap space
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:132)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntriesIfNeeded(BatchingShuffleEntryReader.java:119)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.hasNext(BatchingShuffleEntryReader.java:84)
    	at org.apache.beam.runners.dataflow.worker.util.common.ForwardingReiterator.hasNext(ForwardingReiterator.java:63)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.advance(GroupingShuffleEntryIterator.java:271)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.hasNext(GroupingShuffleEntryIterator.java:263)
    	at org.apache.beam.runners.dataflow.worker.GroupingShuffleReader$GroupingShuffleReaderIterator$ValuesIterator.hasNext(GroupingShuffleReader.java:397)
    	at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:55)
    	at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:39)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:115)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:73)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:114)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)
    Caused by: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.ExecutionError: java.lang.OutOfMemoryError: Java heap space
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2048)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader.read(CachingShuffleBatchReader.java:74)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:125)
    	... 26 more
    Caused by: java.lang.OutOfMemoryError: Java heap space
    	at org.apache.beam.runners.dataflow.worker.ApplianceShuffleReader.readIncludingPosition(Native Method)
    	at org.apache.beam.runners.dataflow.worker.ChunkingShuffleBatchReader.read(ChunkingShuffleBatchReader.java:58)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader$1.load(CachingShuffleBatchReader.java:55)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader$1.load(CachingShuffleBatchReader.java:51)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader.read(CachingShuffleBatchReader.java:74)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:125)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntriesIfNeeded(BatchingShuffleEntryReader.java:119)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.hasNext(BatchingShuffleEntryReader.java:84)
    	at org.apache.beam.runners.dataflow.worker.util.common.ForwardingReiterator.hasNext(ForwardingReiterator.java:63)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.advance(GroupingShuffleEntryIterator.java:271)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.hasNext(GroupingShuffleEntryIterator.java:263)
    	at org.apache.beam.runners.dataflow.worker.GroupingShuffleReader$GroupingShuffleReaderIterator$ValuesIterator.hasNext(GroupingShuffleReader.java:397)
    	at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:55)
    	at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:39)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:115)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:73)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:114)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
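
    Note: the "higher memory instances in PipelineOptions" hint above maps to
    the Dataflow worker machine type. A minimal sketch of raising per-worker
    memory (the n1-highmem-4 choice is only an example; batch workers otherwise
    default to n1-standard-1):

        import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions;
        import org.apache.beam.sdk.options.PipelineOptionsFactory;

        public class WorkerMemoryExample {
          public static void main(String[] args) {
            DataflowPipelineWorkerPoolOptions options =
                PipelineOptionsFactory.fromArgs(args)
                    .as(DataflowPipelineWorkerPoolOptions.class);
            // A highmem machine type gives the shuffle-reading workers more heap:
            options.setWorkerMachineType("n1-highmem-4");
            // equivalent command-line flag: --workerMachineType=n1-highmem-4
          }
        }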

    Dec 09, 2019 5:22:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T17:22:57.409Z: Checking permissions granted to controller Service Account.
    Dec 09, 2019 5:28:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T17:28:57.390Z: Checking permissions granted to controller Service Account.
    Dec 09, 2019 5:30:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2019-12-09T17:30:36.660Z: An OutOfMemoryException occurred. Consider specifying higher memory instances in PipelineOptions.
    java.lang.RuntimeException: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.ExecutionError: java.lang.OutOfMemoryError: Java heap space
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:132)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntriesIfNeeded(BatchingShuffleEntryReader.java:119)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.hasNext(BatchingShuffleEntryReader.java:84)
    	at org.apache.beam.runners.dataflow.worker.util.common.ForwardingReiterator.hasNext(ForwardingReiterator.java:63)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.advance(GroupingShuffleEntryIterator.java:271)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.hasNext(GroupingShuffleEntryIterator.java:263)
    	at org.apache.beam.runners.dataflow.worker.GroupingShuffleReader$GroupingShuffleReaderIterator$ValuesIterator.hasNext(GroupingShuffleReader.java:397)
    	at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:55)
    	at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:39)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:115)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:73)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:114)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)
    Caused by: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.ExecutionError: java.lang.OutOfMemoryError: Java heap space
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2048)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader.read(CachingShuffleBatchReader.java:74)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:125)
    	... 26 more
    Caused by: java.lang.OutOfMemoryError: Java heap space
    	at org.apache.beam.runners.dataflow.worker.ApplianceShuffleReader.readIncludingPosition(Native Method)
    	at org.apache.beam.runners.dataflow.worker.ChunkingShuffleBatchReader.read(ChunkingShuffleBatchReader.java:58)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader$1.load(CachingShuffleBatchReader.java:55)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader$1.load(CachingShuffleBatchReader.java:51)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader.read(CachingShuffleBatchReader.java:74)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:125)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntriesIfNeeded(BatchingShuffleEntryReader.java:119)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.hasNext(BatchingShuffleEntryReader.java:84)
    	at org.apache.beam.runners.dataflow.worker.util.common.ForwardingReiterator.hasNext(ForwardingReiterator.java:63)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.advance(GroupingShuffleEntryIterator.java:271)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.hasNext(GroupingShuffleEntryIterator.java:263)
    	at org.apache.beam.runners.dataflow.worker.GroupingShuffleReader$GroupingShuffleReaderIterator$ValuesIterator.hasNext(GroupingShuffleReader.java:397)
    	at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:55)
    	at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:39)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:115)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:73)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:114)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
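
The SEVERE entry above suggests raising worker memory through PipelineOptions. A minimal sketch of that change for the Dataflow runner follows; the class name and the n1-highmem-4 machine type are illustrative choices, not taken from this log:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class HighMemWorkerOptions {
      public static void main(String[] args) {
        // Request workers with more memory per vCPU; equivalent to passing
        // --workerMachineType=n1-highmem-4 on the command line.
        DataflowPipelineWorkerPoolOptions options =
            PipelineOptionsFactory.fromArgs(args)
                .withValidation()
                .as(DataflowPipelineWorkerPoolOptions.class);
        options.setWorkerMachineType("n1-highmem-4");
        // ... build and run the pipeline with these options ...
      }
    }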

    Dec 09, 2019 5:34:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T17:34:16.890Z: Finished operation Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable+Write to BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/StreamingWrite
    Dec 09, 2019 5:34:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2019-12-09T17:34:17.579Z: Workflow failed. Causes: S04:Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable+Write to BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/StreamingWrite failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
      testpipeline-jenkins-1209-12090834-ru94-harness-wmpq
          Root cause: The worker lost contact with the service.,
      testpipeline-jenkins-1209-12090834-ru94-harness-c4nc
          Root cause: The worker testpipeline-jenkins-1209-12090834-ru94-harness-c4nc has been reported dead. Aborting lease 67448562114746423.,
      testpipeline-jenkins-1209-12090834-ru94-harness-tc38
          Root cause: The worker lost contact with the service.,
      testpipeline-jenkins-1209-12090834-ru94-harness-tc38
          Root cause: The worker lost contact with the service.
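
The four "lost contact" root causes are consistent with the OutOfMemoryError logged above: a worker whose JVM exhausts its heap during the shuffle read stops heartbeating, is reported dead by the service, and the work item is retried until the four-attempt limit is reached.
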
    Dec 09, 2019 5:34:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T17:34:17.729Z: Cleaning up.
    Dec 09, 2019 5:34:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T17:34:17.857Z: Stopping worker pool...
    Dec 09, 2019 5:38:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T17:38:29.128Z: Autoscaling: Resized worker pool from 5 to 0.
    Dec 09, 2019 5:38:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T17:38:29.163Z: Worker pool stopped.
    Dec 09, 2019 5:38:55 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2019-12-09_08_34_45-12997337172864520920 failed with status FAILED.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead FAILED
    java.lang.NoClassDefFoundError: com/google/api/client/json/jackson/JacksonFactory
        at com.google.cloud.bigquery.BigQueryOptions$DefaultBigQueryRpcFactory.create(BigQueryOptions.java:54)
        at com.google.cloud.bigquery.BigQueryOptions$DefaultBigQueryRpcFactory.create(BigQueryOptions.java:48)
        at com.google.cloud.ServiceOptions.getRpc(ServiceOptions.java:509)
        at com.google.cloud.bigquery.BigQueryOptions.getBigQueryRpcV2(BigQueryOptions.java:116)
        at com.google.cloud.bigquery.BigQueryImpl.<init>(BigQueryImpl.java:139)
        at com.google.cloud.bigquery.BigQueryOptions$DefaultBigQueryFactory.create(BigQueryOptions.java:44)
        at com.google.cloud.bigquery.BigQueryOptions$DefaultBigQueryFactory.create(BigQueryOptions.java:38)
        at com.google.cloud.ServiceOptions.getService(ServiceOptions.java:497)
        at org.apache.beam.sdk.testutils.publishing.BigQueryClient.create(BigQueryClient.java:68)
        at org.apache.beam.sdk.testutils.publishing.BigQueryResultsPublisher.create(BigQueryResultsPublisher.java:39)
        at org.apache.beam.sdk.testutils.metrics.IOITMetrics.publish(IOITMetrics.java:68)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.extractAndPublishTime(BigQueryIOIT.java:198)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWrite(BigQueryIOIT.java:182)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testJsonWrite(BigQueryIOIT.java:143)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWriteThenRead(BigQueryIOIT.java:128)

        Caused by:
        java.lang.ClassNotFoundException: com.google.api.client.json.jackson.JacksonFactory
            at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
            at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
            at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
            at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
            ... 15 more

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > classMethod FAILED
    java.lang.NoClassDefFoundError: com/google/api/client/json/jackson/JacksonFactory
        at com.google.cloud.bigquery.BigQueryOptions$DefaultBigQueryRpcFactory.create(BigQueryOptions.java:54)
        at com.google.cloud.bigquery.BigQueryOptions$DefaultBigQueryRpcFactory.create(BigQueryOptions.java:48)
        at com.google.cloud.ServiceOptions.getRpc(ServiceOptions.java:509)
        at com.google.cloud.bigquery.BigQueryOptions.getBigQueryRpcV2(BigQueryOptions.java:116)
        at com.google.cloud.bigquery.BigQueryImpl.<init>(BigQueryImpl.java:139)
        at com.google.cloud.bigquery.BigQueryOptions$DefaultBigQueryFactory.create(BigQueryOptions.java:44)
        at com.google.cloud.bigquery.BigQueryOptions$DefaultBigQueryFactory.create(BigQueryOptions.java:38)
        at com.google.cloud.ServiceOptions.getService(ServiceOptions.java:497)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.tearDown(BigQueryIOIT.java:116)

        Caused by:
        java.lang.ClassNotFoundException: com.google.api.client.json.jackson.JacksonFactory
            at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
            at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
            at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
            at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
            ... 9 more
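
Both test failures above are the same missing class. com.google.api.client.json.jackson.JacksonFactory ships in the Jackson 1 adapter artifact (google-http-client-jackson); newer google-http-client stacks keep only the Jackson 2 adapter, com.google.api.client.json.jackson2.JacksonFactory from google-http-client-jackson2. The likely cause is therefore a dependency change that dropped the Jackson 1 adapter from the test classpath while com.google.cloud.bigquery still loads it. A small probe to confirm which adapter a given classpath actually contains (only the two class names from the traces are assumed):

    public class JacksonFactoryProbe {
      public static void main(String[] args) {
        String[] candidates = {
          "com.google.api.client.json.jackson.JacksonFactory",  // Jackson 1 adapter
          "com.google.api.client.json.jackson2.JacksonFactory"  // Jackson 2 adapter
        };
        for (String name : candidates) {
          try {
            Class.forName(name);
            System.out.println("present: " + name);
          } catch (ClassNotFoundException e) {
            System.out.println("missing:  " + name);
          }
        }
      }
    }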

2 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.036 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest>
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 1 hrs 4 mins 23.672 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 5m 28s
80 actionable tasks: 58 executed, 22 from cache

Publishing build scan...
https://scans.gradle.com/s/qf7g4m4pldp62

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_BiqQueryIO_Streaming_Performance_Test_Java - Build # 235 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_BiqQueryIO_Streaming_Performance_Test_Java (build #235)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/235/ to view the results.

Build failed in Jenkins: beam_BiqQueryIO_Streaming_Performance_Test_Java #234

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/234/display/redirect>

Changes:


------------------------------------------
[...truncated 278.77 KB...]
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.advance(GroupingShuffleEntryIterator.java:271)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.hasNext(GroupingShuffleEntryIterator.java:263)
    	at org.apache.beam.runners.dataflow.worker.GroupingShuffleReader$GroupingShuffleReaderIterator$ValuesIterator.hasNext(GroupingShuffleReader.java:397)
    	at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:55)
    	at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:39)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:115)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:73)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:114)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)

    Dec 09, 2019 1:41:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T13:41:54.300Z: Checking permissions granted to controller Service Account.
    Dec 09, 2019 1:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T13:47:54.263Z: Checking permissions granted to controller Service Account.
    Dec 09, 2019 1:50:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2019-12-09T13:50:00.884Z: An OutOfMemoryException occurred. Consider specifying higher memory instances in PipelineOptions.
    java.lang.RuntimeException: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.ExecutionError: java.lang.OutOfMemoryError: Java heap space
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:132)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntriesIfNeeded(BatchingShuffleEntryReader.java:119)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.hasNext(BatchingShuffleEntryReader.java:84)
    	at org.apache.beam.runners.dataflow.worker.util.common.ForwardingReiterator.hasNext(ForwardingReiterator.java:63)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.advance(GroupingShuffleEntryIterator.java:271)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.hasNext(GroupingShuffleEntryIterator.java:263)
    	at org.apache.beam.runners.dataflow.worker.GroupingShuffleReader$GroupingShuffleReaderIterator$ValuesIterator.hasNext(GroupingShuffleReader.java:397)
    	at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:55)
    	at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:39)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:115)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:73)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:114)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)
    Caused by: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.ExecutionError: java.lang.OutOfMemoryError: Java heap space
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2048)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader.read(CachingShuffleBatchReader.java:74)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:125)
    	... 26 more
    Caused by: java.lang.OutOfMemoryError: Java heap space
    	at org.apache.beam.runners.dataflow.worker.ChunkingShuffleBatchReader.getFixedLengthPrefixedByteArray(ChunkingShuffleBatchReader.java:98)
    	at org.apache.beam.runners.dataflow.worker.ChunkingShuffleBatchReader.getShuffleEntry(ChunkingShuffleBatchReader.java:82)
    	at org.apache.beam.runners.dataflow.worker.ChunkingShuffleBatchReader.read(ChunkingShuffleBatchReader.java:63)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader$1.load(CachingShuffleBatchReader.java:55)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader$1.load(CachingShuffleBatchReader.java:51)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader.read(CachingShuffleBatchReader.java:74)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:125)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntriesIfNeeded(BatchingShuffleEntryReader.java:119)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.hasNext(BatchingShuffleEntryReader.java:84)
    	at org.apache.beam.runners.dataflow.worker.util.common.ForwardingReiterator.hasNext(ForwardingReiterator.java:63)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.advance(GroupingShuffleEntryIterator.java:271)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.hasNext(GroupingShuffleEntryIterator.java:263)
    	at org.apache.beam.runners.dataflow.worker.GroupingShuffleReader$GroupingShuffleReaderIterator$ValuesIterator.hasNext(GroupingShuffleReader.java:397)
    	at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:55)
    	at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:39)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:115)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:73)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:114)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)

    Dec 09, 2019 1:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T13:53:54.263Z: Checking permissions granted to controller Service Account.
    Dec 09, 2019 1:59:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T13:59:54.390Z: Checking permissions granted to controller Service Account.
    Dec 09, 2019 2:03:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T14:03:30.169Z: Finished operation Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable+Write to BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/StreamingWrite
    Dec 09, 2019 2:03:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2019-12-09T14:03:30.370Z: Workflow failed. Causes: S04:Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable+Write to BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/StreamingWrite failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
      testpipeline-jenkins-1209-12090523-t2aj-harness-pwd0
          Root cause: The worker lost contact with the service.,
      testpipeline-jenkins-1209-12090523-t2aj-harness-lq9p
          Root cause: The worker lost contact with the service.,
      testpipeline-jenkins-1209-12090523-t2aj-harness-lq9p
          Root cause: The worker lost contact with the service.,
      testpipeline-jenkins-1209-12090523-t2aj-harness-2v28
          Root cause: The worker lost contact with the service.
    Dec 09, 2019 2:03:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T14:03:30.513Z: Cleaning up.
    Dec 09, 2019 2:03:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T14:03:31.085Z: Stopping worker pool...
    Dec 09, 2019 2:07:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T14:07:47.541Z: Autoscaling: Resized worker pool from 5 to 0.
    Dec 09, 2019 2:07:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T14:07:47.606Z: Worker pool stopped.
    Dec 09, 2019 2:07:55 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2019-12-09_05_23_35-7411825093350387049 failed with status FAILED.

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead STANDARD_OUT
    Load test results for test (ID): 4928f165-dbe1-4286-84b1-961de06a1725 and timestamp: 2019-12-09T13:23:29.293000000Z:
                     Metric:                    Value:
                  write_time                   140.456

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead STANDARD_ERROR
    Dec 09, 2019 2:07:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Dec 09, 2019 2:07:56 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 183 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Dec 09, 2019 2:07:56 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    WARNING: Skipping non-existent file to stage ${dataflowWorkerJar}.
    Dec 09, 2019 2:07:56 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    WARNING: Skipping non-existent file to stage ${dataflowWorkerJar}.
    Dec 09, 2019 2:07:56 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    WARNING: Skipping non-existent file to stage <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/resources/test>.
    Dec 09, 2019 2:07:56 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    WARNING: Skipping non-existent file to stage <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/classes/java/main>.
    Dec 09, 2019 2:07:56 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    WARNING: Skipping non-existent file to stage <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/resources/main>.
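
The literal ${dataflowWorkerJar} in the warnings above is an unexpanded build placeholder: the corresponding filesToStage entry was assembled from a property that was never substituted for this run, so PackageUtil skips it as a non-existent file rather than failing the upload.
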
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 178 files cached, 0 files newly uploaded in 0 seconds
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/TriggerIdCreation/Read(CreateSource) as step s1
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map as step s2
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey as step s3
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues as step s4
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map as step s5
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey) as step s6
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly as step s7
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/ViewId/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow) as step s8
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/ViewId/Combine.GloballyAsSingletonView/CreateDataflowView as step s9
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/Read(BigQueryTableSource) as step s10
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/PassThroughThenCleanup/ParMultiDo(Identity) as step s11
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/PassThroughThenCleanup/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) as step s12
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/PassThroughThenCleanup/View.AsIterable/CreateDataflowView as step s13
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/PassThroughThenCleanup/Create(CleanupOperation)/Read(CreateSource) as step s14
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/PassThroughThenCleanup/Cleanup as step s15
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Gather time as step s16
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <48505 bytes, hash YzbfkhBux1jcRt1EPMbWkg> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-YzbfkhBux1jcRt1EPMbWkg.pb
    Dec 09, 2019 2:07:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.19.0-SNAPSHOT
    Dec 09, 2019 2:07:58 PM org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler handleResponse
    WARNING: Request failed with code 400, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs. 

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead FAILED
    java.lang.RuntimeException: Failed to create a workflow job: (9182231a18263cc9): The workflow could not be created. Causes: (46e2ca637dda3b74): Dataflow quota error for jobs-per-project quota. Project apache-beam-testing is running 300 jobs. Please check the quota usage via GCP Console. If it exceeds the limit, please wait for a workflow to finish or contact Google Cloud Support to request an increase in quota. If it does not, contact Google Cloud Support.
        at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:974)
        at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:188)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:315)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:301)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testRead(BigQueryIOIT.java:190)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWriteThenRead(BigQueryIOIT.java:131)

        Caused by:
        com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
        {
          "code" : 400,
          "errors" : [ {
            "domain" : "global",
            "message" : "(9182231a18263cc9): The workflow could not be created. Causes: (46e2ca637dda3b74): Dataflow quota error for jobs-per-project quota. Project apache-beam-testing is running 300 jobs. Please check the quota usage via GCP Console. If it exceeds the limit, please wait for a workflow to finish or contact Google Cloud Support to request an increase in quota. If it does not, contact Google Cloud Support.",
            "reason" : "failedPrecondition"
          } ],
          "message" : "(9182231a18263cc9): The workflow could not be created. Causes: (46e2ca637dda3b74): Dataflow quota error for jobs-per-project quota. Project apache-beam-testing is running 300 jobs. Please check the quota usage via GCP Console. If it exceeds the limit, please wait for a workflow to finish or contact Google Cloud Support to request an increase in quota. If it does not, contact Google Cloud Support.",
          "status" : "FAILED_PRECONDITION"
        }
            at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
            at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
            at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:417)
            at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1132)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:515)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:448)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
            at org.apache.beam.runners.dataflow.DataflowClient.createJob(DataflowClient.java:61)
            at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:960)
            ... 5 more
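
Unlike the worker-side failures earlier in this log, this run is rejected at submission time: the CreateJob call fails because apache-beam-testing is already running 300 concurrent Dataflow jobs. A quick shell check of what is holding the quota, with the region taken from the request URL in the warning above:

    gcloud dataflow jobs list --status=active --region=us-central1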

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED

1 test completed, 1 failed
Finished generating test XML results (0.036 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.054 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest>
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 44 mins 31.608 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 47m 16s
80 actionable tasks: 79 executed, 1 from cache

Publishing build scan...
https://scans.gradle.com/s/h3fmtx25u5w7m

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_BiqQueryIO_Streaming_Performance_Test_Java #233

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/233/display/redirect?page=changes>

Changes:

[michal.walenia] [BEAM-8895] Add BigQuery table name sanitization to BigQueryIOIT

[michal.walenia] [BEAM-8918] Split batch BQIOIT into avro and json using tests


------------------------------------------
[...truncated 199.19 KB...]
  No history is available.
Custom actions are attached to task ':sdks:java:harness:shadowJar'.

> Task :sdks:java:core:shadowTestJar
Build cache key for task ':sdks:java:core:shadowTestJar' is e281e1fb529b1f0c09c29434ae0ce49a
Caching disabled for task ':sdks:java:core:shadowTestJar': Caching has not been enabled for the task
Task ':sdks:java:core:shadowTestJar' is not up-to-date because:
  No history is available.
*******************
GRADLE SHADOW STATS

Total Jars: 5 (includes project)
Total Time: 1.802s [1802ms]
Average Time/Jar: 0.3604s [360.4ms]
*******************
:sdks:java:core:shadowTestJar (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 3.502 secs.
:sdks:java:extensions:google-cloud-platform-core:compileTestJava (Thread[Execution worker for ':' Thread 10,5,main]) started.

> Task :sdks:java:extensions:google-cloud-platform-core:compileTestJava FROM-CACHE
Build cache key for task ':sdks:java:extensions:google-cloud-platform-core:compileTestJava' is 9fdec4b1d291a168a5f3210f81f0f3ad
Task ':sdks:java:extensions:google-cloud-platform-core:compileTestJava' is not up-to-date because:
  No history is available.
Origin for org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution@1541a26a: {executionTime=5946, hostName=apache-beam-jenkins-10, operatingSystem=Linux, buildInvocationId=kusubjxzfjeqpa264jv4o45xsm, creationTime=1575659896443, identity=:sdks:java:extensions:google-cloud-platform-core:compileTestJava, type=org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.TaskExecution, userName=jenkins, gradleVersion=5.2.1, rootPath=/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Java11_ValidatesRunner_Dataflow/src}
Unpacked trees for task ':sdks:java:extensions:google-cloud-platform-core:compileTestJava' from cache.
:sdks:java:extensions:google-cloud-platform-core:compileTestJava (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 0.365 secs.
:sdks:java:extensions:google-cloud-platform-core:testClasses (Thread[Execution worker for ':' Thread 10,5,main]) started.

> Task :sdks:java:extensions:google-cloud-platform-core:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:google-cloud-platform-core:testClasses' as it has no actions.
:sdks:java:extensions:google-cloud-platform-core:testClasses (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:google-cloud-platform-core:testJar (Thread[Execution worker for ':' Thread 10,5,main]) started.

> Task :sdks:java:extensions:google-cloud-platform-core:testJar
Build cache key for task ':sdks:java:extensions:google-cloud-platform-core:testJar' is da19b672e4cf9a1cd9e1c7f49c9b4030
Caching disabled for task ':sdks:java:extensions:google-cloud-platform-core:testJar': Caching has not been enabled for the task
Task ':sdks:java:extensions:google-cloud-platform-core:testJar' is not up-to-date because:
  No history is available.
:sdks:java:extensions:google-cloud-platform-core:testJar (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 0.017 secs.

> Task :sdks:java:harness:shadowJar
*******************
GRADLE SHADOW STATS

Total Jars: 53 (includes project)
Total Time: 7.42s [7420ms]
Average Time/Jar: 0.14s [140.0ms]
*******************
:sdks:java:harness:shadowJar (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 7.974 secs.
:runners:java-fn-execution:compileJava (Thread[Execution worker for ':' Thread 8,5,main]) started.

> Task :runners:java-fn-execution:compileJava FROM-CACHE
Build cache key for task ':runners:java-fn-execution:compileJava' is 94fe7e5796c2553a8cd29fd58ff76cea
Task ':runners:java-fn-execution:compileJava' is not up-to-date because:
  No history is available.
Origin for org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution@487d4343: {executionTime=15398, hostName=apache-beam-jenkins-10, operatingSystem=Linux, buildInvocationId=yuka67ek55arrmxobxt45gsnx4, creationTime=1575644222638, identity=:runners:java-fn-execution:compileJava, type=org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.TaskExecution, userName=jenkins, gradleVersion=5.2.1, rootPath=/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Go/src}
Unpacked trees for task ':runners:java-fn-execution:compileJava' from cache.
:runners:java-fn-execution:compileJava (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2.481 secs.
:runners:java-fn-execution:classes (Thread[Execution worker for ':' Thread 8,5,main]) started.

> Task :runners:java-fn-execution:classes UP-TO-DATE
Skipping task ':runners:java-fn-execution:classes' as it has no actions.
:runners:java-fn-execution:classes (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 0.0 secs.
:runners:java-fn-execution:jar (Thread[Execution worker for ':' Thread 4,5,main]) started.

> Task :runners:java-fn-execution:jar
Build cache key for task ':runners:java-fn-execution:jar' is 0d153cfa410c484e3905c88fb6946d24
Caching disabled for task ':runners:java-fn-execution:jar': Caching has not been enabled for the task
Task ':runners:java-fn-execution:jar' is not up-to-date because:
  No history is available.
:runners:java-fn-execution:jar (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.051 secs.
:runners:direct-java:compileJava (Thread[Execution worker for ':' Thread 4,5,main]) started.
:runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava (Thread[Execution worker for ':' Thread 10,5,main]) started.

> Task :runners:direct-java:compileJava FROM-CACHE
Build cache key for task ':runners:direct-java:compileJava' is 227902d6b7b2f7f16cf9baa40b3521b2
Task ':runners:direct-java:compileJava' is not up-to-date because:
  No history is available.
Origin for org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution@2b4a4982: {executionTime=10992, hostName=apache-beam-jenkins-10, operatingSystem=Linux, buildInvocationId=vgfuolsrava4poqijk5yke45te, creationTime=1575644233351, identity=:runners:direct-java:compileJava, type=org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.TaskExecution, userName=jenkins, gradleVersion=5.2.1, rootPath=/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Java_ValidatesRunner_Spark/src}
Unpacked trees for task ':runners:direct-java:compileJava' from cache.
:runners:direct-java:compileJava (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.082 secs.
:runners:direct-java:classes (Thread[Execution worker for ':' Thread 4,5,main]) started.

> Task :runners:direct-java:classes UP-TO-DATE
Skipping task ':runners:direct-java:classes' as it has no actions.
:runners:direct-java:classes (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.0 secs.
:runners:direct-java:shadowJar (Thread[Execution worker for ':' Thread 8,5,main]) started.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava FROM-CACHE
file or directory '<https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/src/main/java'>, not found
Build cache key for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava' is b031d445155c010cf7f8c854c9ae7919
Task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava' is not up-to-date because:
  No history is available.
Origin for org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution@6b2a517b: {executionTime=22661, hostName=apache-beam-jenkins-10, operatingSystem=Linux, buildInvocationId=mzm54k3xknauzojffqnoj2f4he, creationTime=1575663202334, identity=:runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava, type=org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.TaskExecution, userName=jenkins, gradleVersion=5.2.1, rootPath=/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Java/src}
Unpacked trees for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava' from cache.
:runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 0.298 secs.
:runners:google-cloud-dataflow-java:worker:legacy-worker:classes (Thread[Execution worker for ':' Thread 10,5,main]) started.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:worker:legacy-worker:classes' as it has no actions.
:runners:google-cloud-dataflow-java:worker:legacy-worker:classes (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar (Thread[Execution worker for ':' Thread 10,5,main]) started.

> Task :runners:direct-java:shadowJar
Build cache key for task ':runners:direct-java:shadowJar' is dceaba7ea66116cde400c6b27150b630
Caching disabled for task ':runners:direct-java:shadowJar': Caching has not been enabled for the task
Task ':runners:direct-java:shadowJar' is not up-to-date because:
  No history is available.
Custom actions are attached to task ':runners:direct-java:shadowJar'.
*******************
GRADLE SHADOW STATS

Total Jars: 6 (includes project)
Total Time: 0.632s [632ms]
Average Time/Jar: 0.1053333333333s [105.3333333333ms]
*******************
:runners:direct-java:shadowJar (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 0.88 secs.
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution worker for ':' Thread 8,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:compileTestJava FROM-CACHE
Build cache key for task ':sdks:java:io:google-cloud-platform:compileTestJava' is 7e851710cbfebbd8caf52b248ea3a37e
Task ':sdks:java:io:google-cloud-platform:compileTestJava' is not up-to-date because:
  No history is available.
Origin for org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution@740a6efd: {executionTime=9631, hostName=apache-beam-jenkins-10, operatingSystem=Linux, buildInvocationId=kusubjxzfjeqpa264jv4o45xsm, creationTime=1575659909094, identity=:sdks:java:io:google-cloud-platform:compileTestJava, type=org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.TaskExecution, userName=jenkins, gradleVersion=5.2.1, rootPath=/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Java11_ValidatesRunner_Dataflow/src}
Unpacked trees for task ':sdks:java:io:google-cloud-platform:compileTestJava' from cache.
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 0.355 secs.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution worker for ':' Thread 8,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testClasses UP-TO-DATE
Skipping task ':sdks:java:io:google-cloud-platform:testClasses' as it has no actions.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution worker for ':' Thread 8,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testJar
Build cache key for task ':sdks:java:io:google-cloud-platform:testJar' is dc10b013b2d3055a4b8af815790253a9
Caching disabled for task ':sdks:java:io:google-cloud-platform:testJar': Caching has not been enabled for the task
Task ':sdks:java:io:google-cloud-platform:testJar' is not up-to-date because:
  No history is available.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 0.056 secs.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker for ':' Thread 8,5,main]) started.
:sdks:java:io:bigquery-io-perf-tests:compileTestJava (Thread[Execution worker for ':' Thread 2,5,main]) started.

> Task :runners:google-cloud-dataflow-java:compileTestJava FROM-CACHE
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava' is 548038e8c8dcfc04130ba21c7412ad7b
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date because:
  No history is available.
Origin for org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution@76c612ad: {executionTime=6216, hostName=apache-beam-jenkins-10, operatingSystem=Linux, buildInvocationId=kusubjxzfjeqpa264jv4o45xsm, creationTime=1575659915427, identity=:runners:google-cloud-dataflow-java:compileTestJava, type=org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.TaskExecution, userName=jenkins, gradleVersion=5.2.1, rootPath=/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Java11_ValidatesRunner_Dataflow/src}
Unpacked trees for task ':runners:google-cloud-dataflow-java:compileTestJava' from cache.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 0.242 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker for ':' Thread 8,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testClasses UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution worker for ':' Thread 8,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testJar
Build cache key for task ':runners:google-cloud-dataflow-java:testJar' is 3bd4f1e25615c770a7a5a128bbd65836
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar': Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 0.027 secs.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar
Build cache key for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar' is 8afd6da254fbee9c2c1318ce7ed3c589
Caching disabled for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar': Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar' is not up-to-date because:
  No history is available.
Custom actions are attached to task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar'.
*******************
GRADLE SHADOW STATS

Total Jars: 16 (includes project)
Total Time: 4.121s [4121ms]
Average Time/Jar: 0.2575625s [257.5625ms]
*******************
:runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 5.026 secs.

> Task :sdks:java:io:bigquery-io-perf-tests:compileTestJava
Build cache key for task ':sdks:java:io:bigquery-io-perf-tests:compileTestJava' is 6eed03d2b06716692ac7b55fa7b88ee4
Task ':sdks:java:io:bigquery-io-perf-tests:compileTestJava' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task ':sdks:java:io:bigquery-io-perf-tests:compileTestJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with error-prone compiler
Created classpath snapshot for incremental compilation in 1.823 secs. 1478 duplicate classes found in classpath (see all with --debug).
Packing task ':sdks:java:io:bigquery-io-perf-tests:compileTestJava'
:sdks:java:io:bigquery-io-perf-tests:compileTestJava (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 7.336 secs.
:sdks:java:io:bigquery-io-perf-tests:testClasses (Thread[Execution worker for ':' Thread 2,5,main]) started.

> Task :sdks:java:io:bigquery-io-perf-tests:testClasses
Skipping task ':sdks:java:io:bigquery-io-perf-tests:testClasses' as it has no actions.
:sdks:java:io:bigquery-io-perf-tests:testClasses (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 0.0 secs.
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest
Build cache key for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest' is ce82e3faa86eef5af06956c56916f53d
Task ':sdks:java:io:bigquery-io-perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Custom actions are attached to task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
Starting process 'Gradle Test Executor 1'. Working directory: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--writeMethod=STREAMING_INSERTS","--testBigQueryDataset=beam_performance","--testBigQueryTable=bqio_write_10GB_java","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=bqio_10GB_results_java_stream","--sourceOptions={\"numRecords\":\"10485760\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"1024\"}","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=${dataflowWorkerJar}"] -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/5.2.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.19.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.25/bccda40ebc8067491b32a88f49615a747d20082d/slf4j-jdk14-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > classMethod FAILED
    java.lang.NullPointerException: Name is null
        at java.lang.Enum.valueOf(Enum.java:236)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$WriteFormat.valueOf(BigQueryIOIT.java:261)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.setup(BigQueryIOIT.java:106)

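The NullPointerException above comes from Enum.valueOf receiving null: BigQueryIOIT.setup (BigQueryIOIT.java:106) resolves the write format through WriteFormat.valueOf, and the -DbeamTestPipelineOptions list in this build's test invocation carries no write-format option at all, which is consistent with a null name reaching the enum lookup. A minimal, self-contained sketch of that failure mode and a more defensive parse; the enum constants and option name here are illustrative assumptions, not the test's actual code:

    public class WriteFormatParse {
      enum WriteFormat { JSON, AVRO }

      static WriteFormat parse(String raw) {
        if (raw == null) {
          // Enum.valueOf(null) surfaces as NullPointerException("Name is null"),
          // exactly as in the classMethod failure above; fail with a readable
          // configuration error instead.
          throw new IllegalArgumentException(
              "writeFormat is required (one of JSON, AVRO)");
        }
        return WriteFormat.valueOf(raw.toUpperCase(java.util.Locale.ROOT));
      }

      public static void main(String[] args) {
        System.out.println(parse("json")); // JSON
        parse(null);                       // readable error instead of a bare NPE
      }
    }
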
1 test completed, 1 failed
Finished generating test XML results (0.015 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest>
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4.088 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 53s
80 actionable tasks: 53 executed, 27 from cache

Publishing build scan...
https://scans.gradle.com/s/gns733i76spsq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_BiqQueryIO_Streaming_Performance_Test_Java #232

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/232/display/redirect>

Changes:


------------------------------------------
[...truncated 523.23 KB...]
        "message" : "Invalid table ID \"bqio_write_10GB_java_9b73ebab-8df5-4909-9c37-5bfd4803e22b\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
        "reason" : "invalid"
      } ],
      "message" : "Invalid table ID \"bqio_write_10GB_java_9b73ebab-8df5-4909-9c37-5bfd4803e22b\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
      "status" : "INVALID_ARGUMENT"
    }
    	at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
    	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
    	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:417)
    	at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1132)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:515)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:448)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.tryCreateTable(BigQueryServicesImpl.java:520)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.createTable(BigQueryServicesImpl.java:505)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:205)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
    	at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1.processElement(PrepareWrite.java:82)
    	at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:180)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV.process(BigQueryIOIT.java:243)
    	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
    	at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)

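Every 400 response in these builds rejects the same shape of table ID: a fixed prefix plus a suffix that still contains hyphens, while the error message itself states that BigQuery table IDs may contain only alphanumerics and underscores. A self-contained sketch of that mismatch and the obvious sanitization; the assumption that the suffix comes from UUID.randomUUID() is inferred from the 8-4-4-4-12 pattern in the logged IDs, not confirmed from the test source:

    import java.util.UUID;

    public class TableIdSanitizer {
      // BigQuery table IDs may contain only letters, digits, and underscores.
      static String uniqueTableId(String base) {
        return base + "_" + UUID.randomUUID().toString().replace('-', '_');
      }

      public static void main(String[] args) {
        // e.g. bqio_write_10GB_java_9b73ebab_8df5_4909_9c37_5bfd4803e22b, which
        // is valid, unlike the hyphenated ID rejected in the trace above.
        System.out.println(uniqueTableId("bqio_write_10GB_java"));
      }
    }
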
    Dec 09, 2019 6:36:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2019-12-09T06:36:37.808Z: java.lang.RuntimeException: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
    {
      "code" : 400,
      "errors" : [ {
        "domain" : "global",
        "message" : "Invalid table ID \"bqio_write_10GB_java_9b73ebab-8df5-4909-9c37-5bfd4803e22b\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
        "reason" : "invalid"
      } ],
      "message" : "Invalid table ID \"bqio_write_10GB_java_9b73ebab-8df5-4909-9c37-5bfd4803e22b\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
      "status" : "INVALID_ARGUMENT"
    }
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:208)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
    	at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
    Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
    {
      "code" : 400,
      "errors" : [ {
        "domain" : "global",
        "message" : "Invalid table ID \"bqio_write_10GB_java_9b73ebab-8df5-4909-9c37-5bfd4803e22b\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
        "reason" : "invalid"
      } ],
      "message" : "Invalid table ID \"bqio_write_10GB_java_9b73ebab-8df5-4909-9c37-5bfd4803e22b\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
      "status" : "INVALID_ARGUMENT"
    }
    	at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
    	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
    	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:417)
    	at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1132)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:515)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:448)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.tryCreateTable(BigQueryServicesImpl.java:520)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.createTable(BigQueryServicesImpl.java:505)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:205)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
    	at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1.processElement(PrepareWrite.java:82)
    	at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:180)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV.process(BigQueryIOIT.java:243)
    	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
    	at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)

    Dec 09, 2019 6:36:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T06:36:37.858Z: Finished operation Read from source+Gather time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
    Dec 09, 2019 6:36:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2019-12-09T06:36:37.973Z: Workflow failed. Causes: S02:Read from source+Gather time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
      testpipeline-jenkins-1209-12082234-dsp9-harness-2hcq
          Root cause: Work item failed.,
      testpipeline-jenkins-1209-12082234-dsp9-harness-1sp9
          Root cause: Work item failed.,
      testpipeline-jenkins-1209-12082234-dsp9-harness-sjb4
          Root cause: Work item failed.,
      testpipeline-jenkins-1209-12082234-dsp9-harness-sjb4
          Root cause: Work item failed.
    Dec 09, 2019 6:36:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T06:36:38.111Z: Cleaning up.
    Dec 09, 2019 6:36:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T06:36:38.257Z: Stopping worker pool...
    Dec 09, 2019 6:38:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T06:38:19.381Z: Autoscaling: Resized worker pool from 5 to 0.
    Dec 09, 2019 6:38:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T06:38:19.440Z: Worker pool stopped.
    Dec 09, 2019 6:38:27 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2019-12-08_22_34_02-4339219542582996891 failed with status FAILED.

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead STANDARD_OUT
    Load test results for test (ID): 9b73ebab-8df5-4909-9c37-5bfd4803e22b and timestamp: 2019-12-09T06:33:56.502000000Z:
                     Metric:                    Value:
                  write_time                       0.0

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead FAILED
    java.lang.IllegalArgumentException: Writing avro formatted data is only supported for FILE_LOADS, however the method was STREAMING_INSERTS
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:216)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expand(BigQueryIO.java:2394)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expand(BigQueryIO.java:1665)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:539)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:490)
        at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWrite(BigQueryIOIT.java:159)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testAvroWrite(BigQueryIOIT.java:148)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWriteThenRead(BigQueryIOIT.java:121)

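The testWriteThenRead failure above is a pipeline-construction-time precondition, not a service error: BigQueryIO.Write.expand (BigQueryIO.java:2394) rejects Avro-formatted writes unless the write method is FILE_LOADS, and per the exception message this run used STREAMING_INSERTS. A self-contained sketch of the guard; the real check uses Beam's vendored Guava Preconditions.checkArgument, and the enum below mirrors BigQueryIO.Write.Method:

    public class AvroWriteGuard {
      enum Method { DEFAULT, FILE_LOADS, STREAMING_INSERTS }

      static void checkAvroWriteMethod(Method method) {
        if (method != Method.FILE_LOADS) {
          throw new IllegalArgumentException(
              "Writing avro formatted data is only supported for FILE_LOADS, "
                  + "however the method was " + method);
        }
      }

      public static void main(String[] args) {
        // Throws, matching the IllegalArgumentException in the failure above.
        checkAvroWriteMethod(Method.STREAMING_INSERTS);
      }
    }

Running the Avro leg of this test therefore requires --writeMethod=FILE_LOADS (or skipping the Avro write when streaming inserts are requested).
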
Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED

1 test completed, 1 failed
Finished generating test XML results (0.032 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.043 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest>
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 33.085 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 11s
80 actionable tasks: 52 executed, 28 from cache

Publishing build scan...
https://scans.gradle.com/s/d2sneyimi5lr6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_BiqQueryIO_Streaming_Performance_Test_Java #231

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/231/display/redirect>

Changes:


------------------------------------------
[...truncated 408.98 KB...]
        "message" : "Invalid table ID \"bqio_write_10GB_java_e708b6f4-49a2-47e3-a112-dc1bc814014b\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
        "reason" : "invalid"
      } ],
      "message" : "Invalid table ID \"bqio_write_10GB_java_e708b6f4-49a2-47e3-a112-dc1bc814014b\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
      "status" : "INVALID_ARGUMENT"
    }
    	at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
    	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
    	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:417)
    	at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1132)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:515)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:448)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.tryCreateTable(BigQueryServicesImpl.java:520)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.createTable(BigQueryServicesImpl.java:505)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:205)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
    	at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1.processElement(PrepareWrite.java:82)
    	at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:180)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV.process(BigQueryIOIT.java:243)
    	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
    	at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)

    Dec 09, 2019 12:36:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2019-12-09T00:36:03.023Z: java.lang.RuntimeException: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
    {
      "code" : 400,
      "errors" : [ {
        "domain" : "global",
        "message" : "Invalid table ID \"bqio_write_10GB_java_e708b6f4-49a2-47e3-a112-dc1bc814014b\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
        "reason" : "invalid"
      } ],
      "message" : "Invalid table ID \"bqio_write_10GB_java_e708b6f4-49a2-47e3-a112-dc1bc814014b\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
      "status" : "INVALID_ARGUMENT"
    }
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:208)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
    	at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
    Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
    {
      "code" : 400,
      "errors" : [ {
        "domain" : "global",
        "message" : "Invalid table ID \"bqio_write_10GB_java_e708b6f4-49a2-47e3-a112-dc1bc814014b\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
        "reason" : "invalid"
      } ],
      "message" : "Invalid table ID \"bqio_write_10GB_java_e708b6f4-49a2-47e3-a112-dc1bc814014b\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
      "status" : "INVALID_ARGUMENT"
    }
    	at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
    	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
    	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:417)
    	at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1132)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:515)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:448)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.tryCreateTable(BigQueryServicesImpl.java:520)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.createTable(BigQueryServicesImpl.java:505)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:205)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
    	at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1.processElement(PrepareWrite.java:82)
    	at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:180)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV.process(BigQueryIOIT.java:243)
    	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
    	at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)

    Dec 09, 2019 12:36:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T00:36:03.061Z: Finished operation Read from source+Gather time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
    Dec 09, 2019 12:36:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2019-12-09T00:36:03.173Z: Workflow failed. Causes: S02:Read from source+Gather time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
      testpipeline-jenkins-1209-12081634-n6er-harness-z0kv
          Root cause: Work item failed.,
      testpipeline-jenkins-1209-12081634-n6er-harness-v5k8
          Root cause: Work item failed.,
      testpipeline-jenkins-1209-12081634-n6er-harness-z0kv
          Root cause: Work item failed.,
      testpipeline-jenkins-1209-12081634-n6er-harness-3z22
          Root cause: Work item failed.
    Dec 09, 2019 12:36:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T00:36:03.306Z: Cleaning up.
    Dec 09, 2019 12:36:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T00:36:03.414Z: Stopping worker pool...
    Dec 09, 2019 12:37:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T00:37:55.124Z: Autoscaling: Resized worker pool from 5 to 0.
    Dec 09, 2019 12:37:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T00:37:55.162Z: Worker pool stopped.
    Dec 09, 2019 12:38:01 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2019-12-08_16_34_13-14241203845977232369 failed with status FAILED.

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead STANDARD_OUT
    Load test results for test (ID): e708b6f4-49a2-47e3-a112-dc1bc814014b and timestamp: 2019-12-09T00:34:06.695000000Z:
                     Metric:                    Value:
                  write_time                       0.0

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead FAILED
    java.lang.IllegalArgumentException: Writing avro formatted data is only supported for FILE_LOADS, however the method was STREAMING_INSERTS
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:216)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expand(BigQueryIO.java:2394)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expand(BigQueryIO.java:1665)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:539)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:490)
        at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWrite(BigQueryIOIT.java:159)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testAvroWrite(BigQueryIOIT.java:148)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWriteThenRead(BigQueryIOIT.java:121)
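
The IllegalArgumentException above is raised at pipeline construction time by the Preconditions.checkArgument call in BigQueryIO.Write.expand: an avro format function is only honoured by the FILE_LOADS method, whereas STREAMING_INSERTS sends rows as JSON TableRows through the tabledata.insertAll API. A minimal sketch of the two valid pairings, assuming the Beam 2.x API at the time of this build (TestRow, tableSpec, toGenericRecord and toTableRow are hypothetical placeholders, not the actual BigQueryIOIT code):

    import org.apache.avro.generic.GenericRecord;
    import org.apache.beam.sdk.io.gcp.bigquery.AvroWriteRequest;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;

    // Avro output must be paired with FILE_LOADS ...
    BigQueryIO.<TestRow>write()
        .to(tableSpec)
        .withAvroFormatFunction(
            (AvroWriteRequest<TestRow> request) -> toGenericRecord(request))
        .withMethod(BigQueryIO.Write.Method.FILE_LOADS);

    // ... whereas STREAMING_INSERTS requires a TableRow format function.
    // Combining withAvroFormatFunction with STREAMING_INSERTS reproduces
    // the checkArgument failure reported in this build.
    BigQueryIO.<TestRow>write()
        .to(tableSpec)
        .withFormatFunction(row -> toTableRow(row))
        .withMethod(BigQueryIO.Write.Method.STREAMING_INSERTS);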

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED

1 test completed, 1 failed
Finished generating test XML results (0.029 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.047 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest>
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 3 mins 56.915 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 40s
80 actionable tasks: 52 executed, 28 from cache

Publishing build scan...
https://scans.gradle.com/s/pwe6yfqq4vras

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_BiqQueryIO_Streaming_Performance_Test_Java #230

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/230/display/redirect>

Changes:


------------------------------------------
[...truncated 482.88 KB...]
        "message" : "Invalid table ID \"bqio_write_10GB_java_1d3d2b4f-dc93-4063-998a-6a84b24c8c38\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
        "reason" : "invalid"
      } ],
      "message" : "Invalid table ID \"bqio_write_10GB_java_1d3d2b4f-dc93-4063-998a-6a84b24c8c38\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
      "status" : "INVALID_ARGUMENT"
    }
    	at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
    	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
    	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:417)
    	at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1132)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:515)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:448)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.tryCreateTable(BigQueryServicesImpl.java:520)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.createTable(BigQueryServicesImpl.java:505)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:205)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
    	at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1.processElement(PrepareWrite.java:82)
    	at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:180)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV.process(BigQueryIOIT.java:243)
    	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
    	at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)
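
The 400 response above also pins down the runtime root cause: the test derives its target table from a run UUID ("bqio_write_10GB_java_1d3d2b4f-..."), and the UUID's hyphens violate BigQuery's table ID charset, which per the error message allows only alphanumeric characters and underscores, up to 1024 characters. A minimal sketch of a sanitising helper, assuming the fix is simply to map hyphens to underscores (the helper name is hypothetical):

    import java.util.UUID;

    /** Hypothetical helper: derive a BigQuery-safe table ID from a run UUID. */
    static String safeTableId(String prefix, UUID runId) {
      // Per the error message, BigQuery table IDs may contain only letters,
      // digits and underscores, so replace the UUID's hyphens up front.
      String id = prefix + "_" + runId.toString().replace('-', '_');
      // The error message also caps table IDs at 1024 characters.
      if (id.length() > 1024) {
        throw new IllegalArgumentException("Table ID too long: " + id);
      }
      return id;
    }

    // safeTableId("bqio_write_10GB_java", UUID.randomUUID())
    //   -> e.g. "bqio_write_10GB_java_1d3d2b4f_dc93_4063_998a_6a84b24c8c38"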

    Dec 08, 2019 6:43:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2019-12-08T18:43:52.786Z: java.lang.RuntimeException: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
    {
      "code" : 400,
      "errors" : [ {
        "domain" : "global",
        "message" : "Invalid table ID \"bqio_write_10GB_java_1d3d2b4f-dc93-4063-998a-6a84b24c8c38\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
        "reason" : "invalid"
      } ],
      "message" : "Invalid table ID \"bqio_write_10GB_java_1d3d2b4f-dc93-4063-998a-6a84b24c8c38\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
      "status" : "INVALID_ARGUMENT"
    }
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:208)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
    	at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
    Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
    {
      "code" : 400,
      "errors" : [ {
        "domain" : "global",
        "message" : "Invalid table ID \"bqio_write_10GB_java_1d3d2b4f-dc93-4063-998a-6a84b24c8c38\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
        "reason" : "invalid"
      } ],
      "message" : "Invalid table ID \"bqio_write_10GB_java_1d3d2b4f-dc93-4063-998a-6a84b24c8c38\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
      "status" : "INVALID_ARGUMENT"
    }
    	at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
    	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
    	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:417)
    	at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1132)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:515)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:448)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.tryCreateTable(BigQueryServicesImpl.java:520)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.createTable(BigQueryServicesImpl.java:505)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:205)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
    	at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1.processElement(PrepareWrite.java:82)
    	at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:180)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV.process(BigQueryIOIT.java:243)
    	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
    	at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)
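
The frames between processElement and tryCreateTable also explain why the HTTP failure surfaces through HashMap.computeIfAbsent: CreateTablesFn memoizes table creation per destination, so the BigQuery call runs inside the memoizing lambda. A hypothetical reduction of that pattern (destination and createTableIfNeeded are placeholders):

    import java.util.HashMap;
    import java.util.Map;

    // Table creation is cached per destination; the remote call executes
    // inside computeIfAbsent, so any exception it throws propagates
    // through that frame, exactly as in the trace above.
    Map<String, String> createdTables = new HashMap<>();
    String table = createdTables.computeIfAbsent(
        destination, dest -> createTableIfNeeded(dest));  // may throw here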

    Dec 08, 2019 6:43:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-08T18:43:52.840Z: Finished operation Read from source+Gather time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
    Dec 08, 2019 6:43:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2019-12-08T18:43:52.948Z: Workflow failed. Causes: S02:Read from source+Gather time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
      testpipeline-jenkins-1208-12081041-fyxi-harness-6d92
          Root cause: Work item failed.,
      testpipeline-jenkins-1208-12081041-fyxi-harness-41bq
          Root cause: Work item failed.,
      testpipeline-jenkins-1208-12081041-fyxi-harness-sk40
          Root cause: Work item failed.,
      testpipeline-jenkins-1208-12081041-fyxi-harness-41bq
          Root cause: Work item failed.
    Dec 08, 2019 6:43:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-08T18:43:53.067Z: Cleaning up.
    Dec 08, 2019 6:43:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-08T18:43:53.180Z: Stopping worker pool...
    Dec 08, 2019 6:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-08T18:45:40.381Z: Autoscaling: Resized worker pool from 5 to 0.
    Dec 08, 2019 6:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-08T18:45:40.437Z: Worker pool stopped.
    Dec 08, 2019 6:45:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2019-12-08_10_41_42-2770470705281273193 failed with status FAILED.

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead STANDARD_OUT
    Load test results for test (ID): 1d3d2b4f-dc93-4063-998a-6a84b24c8c38 and timestamp: 2019-12-08T18:41:35.900000000Z:
                     Metric:                    Value:
                  write_time                       0.0

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead FAILED
    java.lang.IllegalArgumentException: Writing avro formatted data is only supported for FILE_LOADS, however the method was STREAMING_INSERTS
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:216)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expand(BigQueryIO.java:2394)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expand(BigQueryIO.java:1665)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:539)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:490)
        at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWrite(BigQueryIOIT.java:159)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testAvroWrite(BigQueryIOIT.java:148)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWriteThenRead(BigQueryIOIT.java:121)

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED

1 test completed, 1 failed
Finished generating test XML results (0.027 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.041 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest>
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 4 mins 16.754 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 55s
80 actionable tasks: 52 executed, 28 from cache

Publishing build scan...
https://scans.gradle.com/s/zkzcz5orbwcvo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_BiqQueryIO_Streaming_Performance_Test_Java #229

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/229/display/redirect>

Changes:


------------------------------------------
[...truncated 513.32 KB...]
        "message" : "Invalid table ID \"bqio_write_10GB_java_47863922-7ff7-40bd-9dc3-aa78bf41a657\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
        "reason" : "invalid"
      } ],
      "message" : "Invalid table ID \"bqio_write_10GB_java_47863922-7ff7-40bd-9dc3-aa78bf41a657\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
      "status" : "INVALID_ARGUMENT"
    }
    	at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
    	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
    	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:417)
    	at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1132)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:515)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:448)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.tryCreateTable(BigQueryServicesImpl.java:520)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.createTable(BigQueryServicesImpl.java:505)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:205)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
    	at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1.processElement(PrepareWrite.java:82)
    	at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:180)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV.process(BigQueryIOIT.java:243)
    	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
    	at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)

    Dec 08, 2019 12:36:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2019-12-08T12:36:18.888Z: java.lang.RuntimeException: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
    {
      "code" : 400,
      "errors" : [ {
        "domain" : "global",
        "message" : "Invalid table ID \"bqio_write_10GB_java_47863922-7ff7-40bd-9dc3-aa78bf41a657\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
        "reason" : "invalid"
      } ],
      "message" : "Invalid table ID \"bqio_write_10GB_java_47863922-7ff7-40bd-9dc3-aa78bf41a657\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
      "status" : "INVALID_ARGUMENT"
    }
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:208)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
    	at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
    Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
    {
      "code" : 400,
      "errors" : [ {
        "domain" : "global",
        "message" : "Invalid table ID \"bqio_write_10GB_java_47863922-7ff7-40bd-9dc3-aa78bf41a657\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
        "reason" : "invalid"
      } ],
      "message" : "Invalid table ID \"bqio_write_10GB_java_47863922-7ff7-40bd-9dc3-aa78bf41a657\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
      "status" : "INVALID_ARGUMENT"
    }
    	at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
    	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
    	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:417)
    	at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1132)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:515)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:448)
    	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.tryCreateTable(BigQueryServicesImpl.java:520)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.createTable(BigQueryServicesImpl.java:505)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:205)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
    	at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
    	at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1.processElement(PrepareWrite.java:82)
    	at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:180)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV.process(BigQueryIOIT.java:243)
    	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    	at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
    	at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    	at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    	at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)

    Dec 08, 2019 12:36:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-08T12:36:19.233Z: Finished operation Read from source+Gather time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
    Dec 08, 2019 12:36:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2019-12-08T12:36:19.681Z: Workflow failed. Causes: S02:Read from source+Gather time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
      testpipeline-jenkins-1208-12080434-7hqz-harness-sxb6
          Root cause: Work item failed.,
      testpipeline-jenkins-1208-12080434-7hqz-harness-9t59
          Root cause: Work item failed.,
      testpipeline-jenkins-1208-12080434-7hqz-harness-1knp
          Root cause: Work item failed.,
      testpipeline-jenkins-1208-12080434-7hqz-harness-kv5x
          Root cause: Work item failed.
    Dec 08, 2019 12:36:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-08T12:36:19.919Z: Cleaning up.
    Dec 08, 2019 12:36:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-08T12:36:20.034Z: Stopping worker pool...
    Dec 08, 2019 12:37:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-08T12:37:47.897Z: Autoscaling: Resized worker pool from 5 to 0.
    Dec 08, 2019 12:37:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-08T12:37:47.946Z: Worker pool stopped.
    Dec 08, 2019 12:37:53 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2019-12-08_04_34_04-6294710322421591517 failed with status FAILED.

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead STANDARD_OUT
    Load test results for test (ID): 47863922-7ff7-40bd-9dc3-aa78bf41a657 and timestamp: 2019-12-08T12:33:57.217000000Z:
                     Metric:                    Value:
                  write_time                       0.0

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead FAILED
    java.lang.IllegalArgumentException: Writing avro formatted data is only supported for FILE_LOADS, however the method was STREAMING_INSERTS
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:216)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expand(BigQueryIO.java:2394)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expand(BigQueryIO.java:1665)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:539)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:490)
        at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWrite(BigQueryIOIT.java:159)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testAvroWrite(BigQueryIOIT.java:148)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWriteThenRead(BigQueryIOIT.java:121)

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED

1 test completed, 1 failed
Finished generating test XML results (0.031 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.041 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest>
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 3 mins 58.943 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 38s
80 actionable tasks: 52 executed, 28 from cache

Publishing build scan...
https://scans.gradle.com/s/qocn2oxtuh36k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
